GLM Midterm Exam, Sofia Gerard 149721

set.seed(1996) 
library(tidyverse)
── Attaching core tidyverse packages ──────────────────────── tidyverse 2.0.0 ──
✔ dplyr     1.1.2     ✔ readr     2.1.4
✔ forcats   1.0.0     ✔ stringr   1.5.0
✔ ggplot2   3.4.4     ✔ tibble    3.2.1
✔ lubridate 1.9.2     ✔ tidyr     1.3.0
✔ purrr     1.0.2     
── Conflicts ────────────────────────────────────────── tidyverse_conflicts() ──
✖ dplyr::filter() masks stats::filter()
✖ dplyr::lag()    masks stats::lag()
ℹ Use the conflicted package (<http://conflicted.r-lib.org/>) to force all conflicts to become errors
library(dplyr)
library(ggplot2)
library(rstan)
Loading required package: StanHeaders

rstan version 2.32.5 (Stan version 2.32.2)

For execution on a local, multicore CPU with excess RAM we recommend calling
options(mc.cores = parallel::detectCores()).
To avoid recompilation of unchanged Stan programs, we recommend calling
rstan_options(auto_write = TRUE)
For within-chain threading using `reduce_sum()` or `map_rect()` Stan functions,
change `threads_per_chain` option:
rstan_options(threads_per_chain = 1)


Attaching package: 'rstan'

The following object is masked from 'package:tidyr':

    extract
library(rstanarm)
Loading required package: Rcpp
This is rstanarm version 2.32.1
- See https://mc-stan.org/rstanarm/articles/priors for changes to default priors!
- Default priors may change, so it's safest to specify priors, even if equivalent to the defaults.
- For execution on a local, multicore CPU with excess RAM we recommend calling
  options(mc.cores = parallel::detectCores())

Attaching package: 'rstanarm'

The following object is masked from 'package:rstan':

    loo
library(cmdstanr)
This is cmdstanr version 0.6.1
- CmdStanR documentation and vignettes: mc-stan.org/cmdstanr
- CmdStan path: /Users/sofiagerard/.cmdstan/cmdstan-2.33.1
- CmdStan version: 2.33.1

A newer version of CmdStan is available. See ?install_cmdstan() to install it.
To disable this check set option or environment variable CMDSTANR_NO_VER_CHECK=TRUE.
library(rstantools)
This is rstantools version 2.3.1.1
library(nleqslv)
library(bayesplot) 
This is bayesplot version 1.10.0
- Online documentation and vignettes at mc-stan.org/bayesplot
- bayesplot theme set to bayesplot::theme_default()
   * Does _not_ affect other ggplot2 plots
   * See ?bayesplot_theme_set for details on theme setting
library(coda)

Attaching package: 'coda'

The following object is masked from 'package:rstan':

    traceplot

Exercise 1:

The expectation of \(X\) is:

\[ E[X] = \frac{\alpha}{\alpha + \beta} = 0.6 \]

which gives the relation between \(\alpha\) and \(\beta\):

\[ 0.4\alpha = 0.6\beta \]

\[ \beta = \frac{2}{3}\alpha \]

The variance of \(X\) is:

\[ \text{var}[X] = \frac{\alpha\beta}{(\alpha + \beta)^2(\alpha + \beta + 1)} = 0.04^2 \]

Substituting \(\beta = \frac{2}{3}\alpha\), so that \(\alpha + \beta = \frac{5}{3}\alpha\):

\[ \frac{\frac{2}{3}\alpha^2}{\left(\frac{5}{3}\alpha\right)^2\left(\frac{5}{3}\alpha + 1\right)} = 0.04^2 \]

\[ \frac{2}{3}\alpha^2 = 0.04^2 \cdot \frac{25}{9}\alpha^2\left(\frac{5}{3}\alpha + 1\right) \]

Dividing by \(\alpha^2\) (assuming \(\alpha \neq 0\)) and solving:

\[ \frac{5}{3}\alpha + 1 = \frac{2/3}{0.04^2 \cdot 25/9} = 150 \]

\[ \alpha = \frac{3}{5}(150 - 1) = 89.4 \]

\[ \beta = \frac{2}{3}\alpha = 59.6 \]

# μ = α / (α + β) 
# σ² = αβ / ((α + β)²(α + β + 1))

# Given data
mu <- 0.60  # mean approval rate
sigma <- 0.04  # standard deviation

# Compute the variance
variance <- sigma^2

# Solve for alpha and beta from the moment formulas
alpha <- (mu * (1 - mu) / variance - 1) * mu
beta <- alpha * (1 / mu - 1)

# Show the values of alpha and beta
print(paste("Alpha:", alpha))
[1] "Alpha: 89.4"
print(paste("Beta:", beta))
[1] "Beta: 59.6"
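The `nleqslv` package loaded above can cross-check the closed-form solution by solving the two moment equations numerically. A minimal sketch (the starting values `c(50, 50)` are arbitrary):

```r
library(nleqslv)

# Moment conditions for Beta(alpha, beta):
# mean 0.6 and standard deviation 0.04
momentos <- function(p) {
  a <- p[1]
  b <- p[2]
  c(a / (a + b) - 0.6,
    a * b / ((a + b)^2 * (a + b + 1)) - 0.04^2)
}

# Solve the nonlinear system from an arbitrary starting point
sol <- nleqslv(c(50, 50), momentos)
sol$x  # approximately (89.4, 59.6)
```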
# Plot the Beta density
curve(dbeta(x, shape1 = alpha, shape2 = beta), 
      from = 0, to = 1, 
      main = "Beta distribution", 
      xlab = "x", 
      ylab = "Density",
      col = "blue")

Part b

# Transformed Normal distribution for the prior knowledge

mean <- mu

# Plot the normal density
x <- seq(0, 1, length.out = 100)
y <- dnorm(x, mean = mean, sd = sqrt(variance))
plot(x, y, type = "l", col = "blue", lwd = 2, 
     main = paste("Normal(", mean, ",", variance, ") distribution"),
     xlab = "x", ylab = "Probability density")

Part c

# Noninformative reference prior for theta

x <- seq(0, 1, length.out = 100)
y <- rep(1, 100)

plot(x, y, type = "l", col = "blue", lwd = 2, 
     main = expression(paste(theta, " ~ Unif(0,1)")),
     xlab = "x", ylab = "Probability density")

Part d

# Model a

# Data for 2024
datos <- list(solicitados = 100, otorgados = 50)

# Prior from part (a): Beta distribution
# Prior parameters alpha and beta
alpha_prior <- 89.40
beta_prior <- 59.60

# Combine the data and prior parameters into a list for Stan
datos_stan1 <- list(solicitados = datos$solicitados, otorgados = datos$otorgados,
                    alpha_prior = alpha_prior, beta_prior = beta_prior)

# Path to the Stan file
ruta_archivo_stan1 <- "Ej1-modeloa.stan"

# Compile the Stan model from the file
modelo_stan1 <- stan_model(file = ruta_archivo_stan1)

# Run the Stan model
resultados_stan1 <- sampling(modelo_stan1, data = datos_stan1, chains = 4, iter = 10000)

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 7e-06 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.07 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 1: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 1: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 1: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 1: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 1: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 1: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 1: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 1: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 1: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 1: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 1: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.024 seconds (Warm-up)
Chain 1:                0.024 seconds (Sampling)
Chain 1:                0.048 seconds (Total)
Chain 1: 

# Summary of the results
print(summary(resultados_stan1))
$summary
              mean      se_mean         sd         2.5%        25%          50%
theta    0.5598829 0.0003611676 0.03130911    0.4981931    0.53871    0.5600657
lp__  -171.3044964 0.0076512481 0.71141617 -173.3049855 -171.46481 -171.0312931
               75%        97.5%    n_eff     Rhat
theta    0.5809936    0.6209651 7514.914 1.000267
lp__  -170.8562856 -170.8066965 8645.356 1.000627

$c_summary
, , chains = chain:1

         stats
parameter         mean         sd         2.5%          25%         50%
    theta    0.5607619 0.03179211    0.4987833    0.5390875    0.560951
    lp__  -171.3208836 0.72764263 -173.3505710 -171.4853208 -171.041632
         stats
parameter          75%        97.5%
    theta    0.5819671    0.6237532
    lp__  -170.8592865 -170.8066313

, , chains = chain:2

         stats
parameter        mean         sd         2.5%          25%          50%
    theta    0.559288 0.03114069    0.4976092    0.5385063    0.5598339
    lp__  -171.299152 0.72579921 -173.3817487 -171.4551154 -171.0229267
         stats
parameter          75%       97.5%
    theta    0.5797235    0.619474
    lp__  -170.8520292 -170.806719

, , chains = chain:3

         stats
parameter         mean         sd         2.5%          25%          50%
    theta    0.5592326 0.03197512    0.4967433    0.5376002    0.5593093
    lp__  -171.3256502 0.73574812 -173.3263187 -171.5041545 -171.0483142
         stats
parameter          75%        97.5%
    theta    0.5811719    0.6191967
    lp__  -170.8615480 -170.8068330

, , chains = chain:4

         stats
parameter         mean         sd       2.5%          25%          50%
    theta    0.5602492 0.03028295    0.50042    0.5396215    0.5604194
    lp__  -171.2722995 0.65222727 -173.11986 -171.4200041 -171.0174413
         stats
parameter          75%       97.5%
    theta    0.5804281    0.619498
    lp__  -170.8527325 -170.806685
# Extract the theta samples
muestras_theta1 <- extract(resultados_stan1)$theta

# Plot the posterior of theta
hist(muestras_theta1, breaks = 30, main = "Posterior of theta, Beta prior",
     xlab = "theta", ylab = "Frequency", col = "skyblue")
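Because the Beta prior is conjugate to the binomial likelihood, this posterior is also available in closed form as Beta(alpha_prior + 50, beta_prior + 50), which gives a quick analytic check on the MCMC mean of ~0.5599 reported above:

```r
# Conjugate update: Beta(89.4, 59.6) prior + 50 successes out of 100 trials
alpha_post <- 89.4 + 50          # prior alpha + loans granted
beta_post  <- 59.6 + (100 - 50)  # prior beta + loans denied

# Analytic posterior mean; should agree with the MCMC summary
alpha_post / (alpha_post + beta_post)
```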

# Model b

# Data for 2024
datos <- list(solicitados = 100, otorgados = 50)

# Prior from part (b): modified Normal(0.6, 0.0016)
# Prior parameters mu and sigma
mu_prior <- 0.6
sigma_prior <- 0.0016

# Combine the data and prior parameters into a list for Stan
datos_stan2 <- list(solicitados = datos$solicitados, otorgados = datos$otorgados,
                    mu_prior = mu_prior, sigma_prior = sigma_prior)

# Path to the Stan file
ruta_archivo_stan2 <- "Ej1-modelob.stan"

# Compile the Stan model from the file
modelo_stan2 <- stan_model(file = ruta_archivo_stan2)

# Run the Stan model
resultados_stan2 <- sampling(modelo_stan2, data = datos_stan2, chains = 4, iter = 10000)

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 6e-06 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.06 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 1: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 1: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 1: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 1: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 1: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 1: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 1: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 1: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 1: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 1: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 1: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.025 seconds (Warm-up)
Chain 1:                0.027 seconds (Sampling)
Chain 1:                0.052 seconds (Total)
Chain 1: 

# Summary of the results
print(summary(resultados_stan2))
$summary
             mean      se_mean          sd        2.5%         25%         50%
theta   0.5999044 1.858194e-05 0.001604577   0.5967629   0.5988209   0.5999162
lp__  -73.2840901 7.540486e-03 0.713098208 -75.2580460 -73.4497374 -73.0114587
              75%       97.5%    n_eff     Rhat
theta   0.6009945   0.6030125 7456.567 1.000517
lp__  -72.8299809 -72.7810659 8943.345 1.000366

$c_summary
, , chains = chain:1

         stats
parameter        mean          sd        2.5%         25%         50%
    theta   0.5999072 0.001638477   0.5966747   0.5988086   0.5999372
    lp__  -73.3055247 0.731494202 -75.3929810 -73.4877893 -73.0239844
         stats
parameter         75%       97.5%
    theta   0.6010357   0.6030397
    lp__  -72.8310545 -72.7810623

, , chains = chain:2

         stats
parameter        mean          sd        2.5%         25%         50%
    theta   0.5998994 0.001571018   0.5968814   0.5988038   0.5999179
    lp__  -73.2631595 0.683538838 -75.0658158 -73.4058264 -73.0137665
         stats
parameter         75%       97.5%
    theta   0.6009871   0.6029137
    lp__  -72.8312244 -72.7813010

, , chains = chain:3

         stats
parameter        mean          sd        2.5%         25%         50%
    theta   0.5998817 0.001587269   0.5967369   0.5988433   0.5998721
    lp__  -73.2731992 0.701342742 -75.2317999 -73.4383469 -72.9963051
         stats
parameter         75%       97.5%
    theta   0.6009482   0.6029712
    lp__  -72.8272974 -72.7811285

, , chains = chain:4

         stats
parameter        mean          sd       2.5%         25%         50%
    theta   0.5999293 0.001620776   0.596818   0.5988242   0.5999311
    lp__  -73.2944772 0.734184152 -75.308599 -73.4610650 -73.0153321
         stats
parameter         75%       97.5%
    theta   0.6010125   0.6031474
    lp__  -72.8308732 -72.7809904
# Extract the theta samples
muestras_theta2 <- extract(resultados_stan2)$theta

# Plot the posterior of theta
hist(muestras_theta2, breaks = 30, main = "Posterior of theta, transformed Normal prior",
     xlab = "theta", ylab = "Frequency", col = "skyblue")
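The Normal prior is not conjugate here, but the posterior can still be checked with a simple grid approximation. This sketch assumes the Stan model uses `sigma_prior = 0.0016` as the standard deviation of a Normal prior on theta, which is consistent with the posterior spread reported above:

```r
# Grid approximation: Normal(0.6, sd = 0.0016) prior times Binomial(100, theta) likelihood
# (the prior sd here is an assumption about the unshown Stan file)
theta_grid <- seq(0.59, 0.61, length.out = 20001)
log_post <- dnorm(theta_grid, mean = 0.6, sd = 0.0016, log = TRUE) +
  dbinom(50, size = 100, prob = theta_grid, log = TRUE)

# Normalize on the grid and take the posterior mean
w <- exp(log_post - max(log_post))
mean_grid <- sum(theta_grid * w) / sum(w)
mean_grid  # close to the MCMC mean ~0.5999
```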

# Model c

# Data for 2024
datos <- list(solicitados = 100, otorgados = 50)

# Prior from part (c): Uniform(0, 1)
# No parameters to specify

# Combine the data into a list for Stan
datos_stan3 <- list(solicitados = datos$solicitados, otorgados = datos$otorgados)

# Path to the Stan file
ruta_archivo_stan3 <- "Ej1-modeloc.stan"

# Compile the Stan model from the file
modelo_stan3 <- stan_model(file = ruta_archivo_stan3)

# Run the Stan model
resultados_stan3 <- sampling(modelo_stan3, data = datos_stan3, chains = 4, iter = 10000)

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 6e-06 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.06 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 1: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 1: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 1: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 1: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 1: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 1: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 1: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 1: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 1: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 1: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 1: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.024 seconds (Warm-up)
Chain 1:                0.025 seconds (Sampling)
Chain 1:                0.049 seconds (Total)
Chain 1: 

# Summary of the results
print(summary(resultados_stan3))
$summary
            mean      se_mean         sd        2.5%         25%         50%
theta   0.499545 0.0005974795 0.04931517   0.4025504   0.4663421   0.4996972
lp__  -71.204568 0.0070474395 0.71776763 -73.2821281 -71.3653693 -70.9285404
              75%       97.5%     n_eff     Rhat
theta   0.5330038   0.5958356  6812.633 1.000191
lp__  -70.7519200 -70.7015819 10373.015 1.000513

$c_summary
, , chains = chain:1

         stats
parameter        mean         sd        2.5%         25%         50%
    theta   0.4998134 0.04937365   0.4043552   0.4661258   0.4997827
    lp__  -71.2056950 0.72136797 -73.2794547 -71.3608275 -70.9338826
         stats
parameter         75%       97.5%
    theta   0.5335305   0.5948625
    lp__  -70.7517115 -70.7016349

, , chains = chain:2

         stats
parameter       mean         sd        2.5%         25%         50%         75%
    theta   0.498337 0.04890992   0.4024369   0.4654035   0.4969119   0.5314679
    lp__  -71.196688 0.70811197 -73.1852727 -71.3504298 -70.9266392 -70.7511517
         stats
parameter       97.5%
    theta   0.5961156
    lp__  -70.7015177

, , chains = chain:3

         stats
parameter        mean         sd        2.5%         25%         50%
    theta   0.4996124 0.05083224   0.3992545   0.4659888   0.5002172
    lp__  -71.2363706 0.75902185 -73.4831978 -71.4066412 -70.9361689
         stats
parameter         75%       97.5%
    theta   0.5337623   0.5983788
    lp__  -70.7545958 -70.7015887

, , chains = chain:4

         stats
parameter        mean         sd        2.5%        25%         50%         75%
    theta   0.5004172 0.04809626   0.4051361   0.467568   0.5018179   0.5331045
    lp__  -71.1795196 0.67930821 -73.0381466 -71.330156 -70.9198070 -70.7500391
         stats
parameter       97.5%
    theta   0.5931362
    lp__  -70.7016026
# Extract the theta samples
muestras_theta3 <- extract(resultados_stan3)$theta

# Plot the posterior of theta
plot(density(muestras_theta3), main = expression(paste(theta, " ~ Unif(0,1) prior")),
     xlab = "theta", ylab = "Probability density", col = "blue")
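Since Uniform(0,1) is Beta(1,1), this posterior is exactly Beta(51, 51), so the MCMC summary can be verified analytically:

```r
# Conjugate update: Beta(1, 1) prior + 50 successes out of 100 trials
a_post <- 1 + 50
b_post <- 1 + 50

# Analytic posterior mean and sd; MCMC reported ~0.4995 and ~0.0493
a_post / (a_post + b_post)
sqrt(a_post * b_post / ((a_post + b_post)^2 * (a_post + b_post + 1)))
```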

Part e

# Load the theta samples from the three models
muestras_theta_a <- extract(resultados_stan1)$theta
muestras_theta_b <- extract(resultados_stan2)$theta
muestras_theta_c <- extract(resultados_stan3)$theta

# Compute the posterior means of theta
theta_media_a <- mean(muestras_theta_a)
theta_media_b <- mean(muestras_theta_b)
theta_media_c <- mean(muestras_theta_c)

# Print the results
cat("Estimated loan-approval rate under the three posterior distributions:\n")
Estimated loan-approval rate under the three posterior distributions:
cat("Part (a):", theta_media_a, "\n")
Part (a): 0.5598829 
cat("Part (b):", theta_media_b, "\n")
Part (b): 0.5999044 
cat("Part (c):", theta_media_c, "\n")
Part (c): 0.499545 

Part f

# Odds of granting a loan from the posterior means: phi = theta / (1 - theta)
phi_a <- theta_media_a / (1 - theta_media_a)
phi_b <- theta_media_b / (1 - theta_media_b)
phi_c <- theta_media_c / (1 - theta_media_c)

# Print the results
cat("Odds of granting a loan using the posterior means:\n")
Odds of granting a loan using the posterior means:
cat("Part (a):", phi_a, "\n")
Part (a): 1.272123 
cat("Part (b):", phi_b, "\n")
Part (b): 1.499403 
cat("Part (c):", phi_c, "\n")
Part (c): 0.9981817 
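Note that plugging the posterior mean of theta into phi = theta / (1 - theta) is not the same as the posterior mean of phi: the odds are a convex function of theta, so by Jensen's inequality the plug-in value understates the posterior mean of the odds. A sketch using the analytic Beta(139.4, 109.6) posterior from part (a) as a stand-in for the MCMC draws:

```r
set.seed(1996)

# Draws from the conjugate posterior of model (a): Beta(89.4 + 50, 59.6 + 50)
draws <- rbeta(1e5, 139.4, 109.6)

phi_plugin <- mean(draws) / (1 - mean(draws))  # plug-in odds from the posterior mean
phi_mean   <- mean(draws / (1 - draws))        # posterior mean of the odds

c(phi_plugin, phi_mean)  # phi_mean is slightly larger, as Jensen's inequality predicts
```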

Exercise 2:

Part a

# Monthly profit data (Normal(mu, sigma^2) likelihood)

utilidades <- c(212, 207, 210, 196, 223, 193, 196, 210, 202, 221)
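Before fitting, a standalone look at the sample moments helps gauge how informative the priors are relative to the data:

```r
# Monthly profits (restated so the snippet is self-contained)
utilidades <- c(212, 207, 210, 196, 223, 193, 196, 210, 202, 221)

mean(utilidades)  # sample mean: 207
var(utilidades)   # sample variance, ~106.4
sd(utilidades)    # sample standard deviation, ~10.3
```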

N <- length(utilidades)
prior_mu_mean <- 200
prior_mu_var <- 40
prior_sigma_shape <- 10
prior_sigma_rate <- 1

# Combine the data into a list for Stan
datos_stan4 <- list(N = N, utilidades = utilidades, prior_mu_mean = prior_mu_mean,
              prior_mu_var = prior_mu_var, prior_sigma_shape = prior_sigma_shape,
              prior_sigma_rate = prior_sigma_rate)

# Path to the Stan file
ruta_archivo_stan4 <- "Ej2-incisoa.stan"

# Compile the Stan model from the file
modelo_stan4 <- stan_model(file = ruta_archivo_stan4)

# Run the Stan model
resultados_stan4 <- sampling(modelo_stan4, data = datos_stan4, chains = 4, iter = 10000)

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 2.2e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.22 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 1: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 1: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 1: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 1: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 1: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 1: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 1: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 1: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 1: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 1: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 1: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.042 seconds (Warm-up)
Chain 1:                0.04 seconds (Sampling)
Chain 1:                0.082 seconds (Total)
Chain 1: 

# Summary of the results
print(summary(resultados_stan4))
$summary
              mean    se_mean        sd      2.5%       25%       50%       75%
mu       206.58937 0.01141248 1.5150467 203.59573 205.57734 206.59619 207.60470
sigma_sq  25.07786 0.02891196 3.7621536  18.45737  22.44228  24.80871  27.44168
lp__     -29.61774 0.01038145 0.9873313 -32.24928 -29.99509 -29.31475 -28.91718
             97.5%     n_eff     Rhat
mu       209.54510 17623.492 1.000027
sigma_sq  33.11755 16932.377 1.000192
lp__     -28.65707  9045.028 1.000144

$c_summary
, , chains = chain:1

          stats
parameter       mean       sd      2.5%       25%       50%       75%     97.5%
  mu       206.58505 1.533864 203.58492 205.55961 206.60433 207.59984 209.58726
  sigma_sq  25.19974 3.764402  18.64220  22.59846  24.89956  27.53531  33.18217
  lp__     -29.62632 1.002601 -32.30337 -30.02783 -29.30814 -28.90827 -28.65584

, , chains = chain:2

          stats
parameter       mean        sd      2.5%       25%       50%       75%
  mu       206.57971 1.4887469 203.70879 205.58220 206.58556 207.57146
  sigma_sq  25.01310 3.7944075  18.17156  22.33619  24.79967  27.35597
  lp__     -29.61518 0.9782762 -32.29914 -29.98818 -29.31543 -28.91377
          stats
parameter      97.5%
  mu       209.49029
  sigma_sq  33.03935
  lp__     -28.65794

, , chains = chain:3

          stats
parameter       mean        sd      2.5%       25%       50%       75%
  mu       206.61926 1.5123890 203.67483 205.56780 206.60594 207.64032
  sigma_sq  25.01095 3.6820833  18.61904  22.40115  24.71958  27.41786
  lp__     -29.59839 0.9528304 -32.10497 -29.97532 -29.31344 -28.92564
          stats
parameter      97.5%
  mu       209.59236
  sigma_sq  32.93128
  lp__     -28.65900

, , chains = chain:4

          stats
parameter       mean       sd      2.5%       25%       50%       75%     97.5%
  mu       206.57346 1.524847 203.50972 205.59221 206.58998 207.59893 209.49150
  sigma_sq  25.08766 3.804481  18.38593  22.43717  24.78116  27.43791  33.20457
  lp__     -29.63105 1.014458 -32.31226 -29.99280 -29.32007 -28.92158 -28.65780
# Trace plot of the posterior draws of mu
# (mcmc_trace comes from bayesplot, which is not attached above)
library(bayesplot)
mcmc_trace(resultados_stan4, pars = "mu")

# Trace plot of the posterior draws of sigma_sq
mcmc_trace(resultados_stan4, pars = "sigma_sq")

# Joint posterior of mu and sigma_sq
mcmc_scatter(resultados_stan4, pars = c("mu", "sigma_sq"))

# Extract the parameter draws
parametros <- extract(resultados_stan4)
mu_values <- parametros$mu
sigma_sq_values <- parametros$sigma_sq

# Posterior means
mu_mean <- mean(mu_values)
sigma_sq_mean <- mean(sigma_sq_values)

# Print the values
cat("The posterior mean of mu is:", mu_mean, "\n")
The posterior mean of mu is: 206.5894 
cat("The posterior mean of sigma^2 is:", sigma_sq_mean, "\n")
The posterior mean of sigma^2 is: 25.07786 

Part (b)

# Using a noninformative prior for the variance

# Monthly profit data
utilidades <- c(212, 207, 210, 196, 223, 193, 196, 210, 202, 221)
N <- length(utilidades)

# Bundle the data; no prior hyperparameters are passed
datos_stan5 <- list(N = N, utilidades = utilidades)

# Path to the Stan file (parameterized in terms of sigma_sq)
ruta_archivo_stan5 <- "Ej2-incisob.stan"

# Compile and run the corrected Stan model
modelo_stan5 <- stan_model(file = ruta_archivo_stan5)
resultados_stan5 <- sampling(modelo_stan5, data = datos_stan5, chains = 4, iter = 10000)

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 2e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.2 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 1: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 1: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 1: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 1: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 1: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 1: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 1: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 1: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 1: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 1: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 1: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.044 seconds (Warm-up)
Chain 1:                0.046 seconds (Sampling)
Chain 1:                0.09 seconds (Total)
Chain 1: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 2).
Chain 2: 
Chain 2: Gradient evaluation took 2e-06 seconds
Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 0.02 seconds.
Chain 2: Adjust your expectations accordingly!
Chain 2: 
Chain 2: 
Chain 2: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 2: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 2: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 2: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 2: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 2: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 2: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 2: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 2: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 2: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 2: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 2: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 2: 
Chain 2:  Elapsed Time: 0.041 seconds (Warm-up)
Chain 2:                0.048 seconds (Sampling)
Chain 2:                0.089 seconds (Total)
Chain 2: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 3).
Chain 3: 
Chain 3: Gradient evaluation took 2e-06 seconds
Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 0.02 seconds.
Chain 3: Adjust your expectations accordingly!
Chain 3: 
Chain 3: 
Chain 3: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 3: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 3: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 3: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 3: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 3: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 3: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 3: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 3: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 3: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 3: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 3: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 3: 
Chain 3:  Elapsed Time: 0.041 seconds (Warm-up)
Chain 3:                0.046 seconds (Sampling)
Chain 3:                0.087 seconds (Total)
Chain 3: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 4).
Chain 4: 
Chain 4: Gradient evaluation took 2e-06 seconds
Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 0.02 seconds.
Chain 4: Adjust your expectations accordingly!
Chain 4: 
Chain 4: 
Chain 4: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 4: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 4: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 4: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 4: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 4: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 4: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 4: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 4: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 4: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 4: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 4: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 4: 
Chain 4:  Elapsed Time: 0.043 seconds (Warm-up)
Chain 4:                0.036 seconds (Sampling)
Chain 4:                0.079 seconds (Total)
Chain 4: 
# Summary of the results
print(summary(resultados_stan5))
$summary
             mean    se_mean        sd      2.5%       25%       50%       75%
mu       207.0057 0.03763518  3.690594 199.66754 204.74255 206.97898 209.28297
sigma_sq 135.1401 0.95432791 86.666495  49.93882  83.72312 113.94764 160.13667
lp__     -28.9007 0.01368885  1.101877 -31.86297 -29.31035 -28.56861 -28.12121
             97.5%    n_eff      Rhat
mu       214.45405 9616.221 0.9999943
sigma_sq 346.49443 8247.213 1.0001971
lp__     -27.84221 6479.363 1.0000766

$c_summary
, , chains = chain:1

          stats
parameter       mean        sd      2.5%       25%       50%       75%
  mu       206.95883  3.767067 199.34856 204.66046 206.92303 209.25183
  sigma_sq 136.04220 96.887322  50.05420  84.30949 113.84379 158.76778
  lp__     -28.91296  1.137032 -31.97877 -29.34305 -28.56059 -28.10505
          stats
parameter      97.5%
  mu       214.64139
  sigma_sq 349.11952
  lp__     -27.84329

, , chains = chain:2

          stats
parameter       mean        sd      2.5%       25%       50%       75%
  mu       207.03019  3.686229 199.62631 204.79351 207.03043 209.32669
  sigma_sq 135.30496 83.202092  51.19694  83.37687 113.58939 161.01209
  lp__     -28.90741  1.107897 -31.93801 -29.30793 -28.56861 -28.12401
          stats
parameter      97.5%
  mu       214.30488
  sigma_sq 353.13950
  lp__     -27.84101

, , chains = chain:3

          stats
parameter       mean        sd      2.5%       25%      50%       75%     97.5%
  mu       207.07533  3.632349 200.10771 204.78983 207.0283 209.33080 214.37594
  sigma_sq 135.38524 86.179246  48.66956  83.59309 114.7512 161.63194 345.93132
  lp__     -28.90289  1.094525 -31.80353 -29.31364 -28.5811 -28.13745 -27.84501

, , chains = chain:4

          stats
parameter       mean        sd      2.5%       25%       50%       75%
  mu       206.95841  3.675212 199.58881 204.71761 206.94238 209.23013
  sigma_sq 133.82805 79.433028  49.86485  83.36060 113.80612 159.25967
  lp__     -28.87952  1.066935 -31.78863 -29.28854 -28.55998 -28.12236
          stats
parameter      97.5%
  mu       214.46867
  sigma_sq 329.97595
  lp__     -27.84049
# Extract the parameter draws from the fitted model
parametros_ajustados <- extract(resultados_stan5)
mu_values_ajustados <- parametros_ajustados$mu
sigma_sq_values_ajustados <- parametros_ajustados$sigma_sq

# Posterior means for the fitted model
mu_mean_ajustado <- mean(mu_values_ajustados)
sigma_sq_mean_ajustado <- mean(sigma_sq_values_ajustados)

# Print the values for the fitted model
cat("The posterior mean of mu is:", mu_mean_ajustado, "\n")
The posterior mean of mu is: 207.0057 
cat("The posterior mean of sigma^2 is:", sigma_sq_mean_ajustado, "\n")
The posterior mean of sigma^2 is: 135.1401 
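
As a sanity check on the sampler, under the standard noninformative prior p(mu, sigma^2) proportional to 1/sigma^2 (assuming this is what Ej2-incisob.stan encodes, since the file itself is not shown), the posterior has a known closed form:

```latex
\sigma^2 \mid y \;\sim\; \text{Scaled-Inv-}\chi^2\!\left(n-1,\; s^2\right),
\qquad
\mu \mid y \;\sim\; t_{\,n-1}\!\left(\bar{y},\; s^2/n\right)
```

With n = 10, ybar = 207, and s^2 = 958/9 ≈ 106.4 computed from the data above, this gives E[mu | y] = 207 and E[sigma^2 | y] = (n-1)s^2/(n-3) = 958/7 ≈ 136.9, in close agreement with the sampled means of 207.0 and 135.1.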

Exercise 3:

Step 1: Data

calificaciones <- read.table("data/calificaciones.txt", header = TRUE, sep = "")

Step 2: Explore the data

# Scatterplot with a fitted regression line
ggplot(calificaciones, aes(x = MO, y = SP)) +
  geom_point() +  
  geom_smooth(method = "lm", col = "red") +  
  labs(title = "Relationship between Moody's and S&P ratings",
       x = "Moody's rating", y = "S&P rating") +
  theme_minimal()  
`geom_smooth()` using formula = 'y ~ x'

Step 3: Stan model

datos_stan_calif <- list(N = nrow(calificaciones),
                   x = calificaciones$MO,
                   y = calificaciones$SP)
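
The contents of Ej3-modelo.stan are not reproduced here; judging by the parameters reported below (alpha, beta, sigma), the model is presumably the simple normal linear regression:

```latex
y_i \mid \alpha, \beta, \sigma \;\sim\; \mathcal{N}\!\left(\alpha + \beta x_i,\; \sigma^2\right),
\qquad i = 1, \dots, N,
```

where x is the Moody's rating and y the S&P rating.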


ruta_modelo_calificaciones <- "Ej3-modelo.stan"

# Compile and run the Stan model
modelo_stan_calif <- stan_model(file = ruta_modelo_calificaciones)
resultados_stan_calif <- sampling(modelo_stan_calif, data = datos_stan_calif, chains = 4, iter = 10000, warmup = 1000)

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 3.3e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.33 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 1: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 1: Iteration: 1001 / 10000 [ 10%]  (Sampling)
Chain 1: Iteration: 2000 / 10000 [ 20%]  (Sampling)
Chain 1: Iteration: 3000 / 10000 [ 30%]  (Sampling)
Chain 1: Iteration: 4000 / 10000 [ 40%]  (Sampling)
Chain 1: Iteration: 5000 / 10000 [ 50%]  (Sampling)
Chain 1: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 1: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 1: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 1: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 1: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.063 seconds (Warm-up)
Chain 1:                0.599 seconds (Sampling)
Chain 1:                0.662 seconds (Total)
Chain 1: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 2).
Chain 2: 
Chain 2: Gradient evaluation took 3e-06 seconds
Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 0.03 seconds.
Chain 2: Adjust your expectations accordingly!
Chain 2: 
Chain 2: 
Chain 2: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 2: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 2: Iteration: 1001 / 10000 [ 10%]  (Sampling)
Chain 2: Iteration: 2000 / 10000 [ 20%]  (Sampling)
Chain 2: Iteration: 3000 / 10000 [ 30%]  (Sampling)
Chain 2: Iteration: 4000 / 10000 [ 40%]  (Sampling)
Chain 2: Iteration: 5000 / 10000 [ 50%]  (Sampling)
Chain 2: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 2: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 2: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 2: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 2: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 2: 
Chain 2:  Elapsed Time: 0.067 seconds (Warm-up)
Chain 2:                0.657 seconds (Sampling)
Chain 2:                0.724 seconds (Total)
Chain 2: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 3).
Chain 3: 
Chain 3: Gradient evaluation took 5e-06 seconds
Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 0.05 seconds.
Chain 3: Adjust your expectations accordingly!
Chain 3: 
Chain 3: 
Chain 3: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 3: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 3: Iteration: 1001 / 10000 [ 10%]  (Sampling)
Chain 3: Iteration: 2000 / 10000 [ 20%]  (Sampling)
Chain 3: Iteration: 3000 / 10000 [ 30%]  (Sampling)
Chain 3: Iteration: 4000 / 10000 [ 40%]  (Sampling)
Chain 3: Iteration: 5000 / 10000 [ 50%]  (Sampling)
Chain 3: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 3: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 3: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 3: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 3: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 3: 
Chain 3:  Elapsed Time: 0.07 seconds (Warm-up)
Chain 3:                0.554 seconds (Sampling)
Chain 3:                0.624 seconds (Total)
Chain 3: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 4).
Chain 4: 
Chain 4: Gradient evaluation took 4e-06 seconds
Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 0.04 seconds.
Chain 4: Adjust your expectations accordingly!
Chain 4: 
Chain 4: 
Chain 4: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 4: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 4: Iteration: 1001 / 10000 [ 10%]  (Sampling)
Chain 4: Iteration: 2000 / 10000 [ 20%]  (Sampling)
Chain 4: Iteration: 3000 / 10000 [ 30%]  (Sampling)
Chain 4: Iteration: 4000 / 10000 [ 40%]  (Sampling)
Chain 4: Iteration: 5000 / 10000 [ 50%]  (Sampling)
Chain 4: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 4: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 4: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 4: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 4: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 4: 
Chain 4:  Elapsed Time: 0.065 seconds (Warm-up)
Chain 4:                0.668 seconds (Sampling)
Chain 4:                0.733 seconds (Total)
Chain 4: 
print(resultados_stan_calif)
Inference for Stan model: anon_model.
4 chains, each with iter=10000; warmup=1000; thin=1; 
post-warmup draws per chain=9000, total post-warmup draws=36000.

       mean se_mean   sd  2.5%   25%   50%   75% 97.5% n_eff Rhat
alpha -1.69    0.01 0.78 -3.24 -2.21 -1.69 -1.19 -0.16  9855    1
beta   0.84    0.00 0.16  0.53  0.74  0.84  0.94  1.15  9810    1
sigma  0.47    0.00 0.09  0.34  0.41  0.46  0.52  0.67 12572    1
lp__   5.20    0.01 1.31  1.79  4.61  5.54  6.15  6.69  9274    1

Samples were drawn using NUTS(diag_e) at Sun Mar 17 13:32:27 2024.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at 
convergence, Rhat=1).
summary(resultados_stan_calif)
$summary
            mean      se_mean         sd       2.5%        25%        50%
alpha -1.6943932 0.0078868179 0.78292550 -3.2415869 -2.2070529 -1.6905531
beta   0.8387735 0.0015672368 0.15523125  0.5329658  0.7378265  0.8383906
sigma  0.4683418 0.0007621465 0.08545492  0.3351405  0.4074590  0.4576057
lp__   5.1985083 0.0136434596 1.31387509  1.7942519  4.6123226  5.5361051
             75%      97.5%     n_eff     Rhat
alpha -1.1864690 -0.1550034  9854.560 1.000072
beta   0.9401311  1.1456687  9810.453 1.000052
sigma  0.5159615  0.6667653 12571.793 1.000148
lp__   6.1505404  6.6895371  9273.830 1.000620

$c_summary
, , chains = chain:1

         stats
parameter       mean         sd       2.5%        25%        50%        75%
    alpha -1.6944900 0.77048711 -3.2243775 -2.2042787 -1.6860629 -1.1700933
    beta   0.8385956 0.15270028  0.5466828  0.7343877  0.8379365  0.9388276
    sigma  0.4674214 0.08458237  0.3337725  0.4078786  0.4577880  0.5157258
    lp__   5.2007596 1.31075804  1.7914751  4.6150289  5.5404856  6.1432890
         stats
parameter      97.5%
    alpha -0.2360018
    beta   1.1413636
    sigma  0.6616661
    lp__   6.6887697

, , chains = chain:2

         stats
parameter       mean        sd       2.5%        25%        50%        75%
    alpha -1.7033840 0.7786718 -3.2290929 -2.2129776 -1.7061346 -1.2081469
    beta   0.8406582 0.1545700  0.5268109  0.7425579  0.8408845  0.9429514
    sigma  0.4688428 0.0846442  0.3347941  0.4082883  0.4589355  0.5167946
    lp__   5.2279670 1.2948281  1.8856178  4.6578586  5.5609634  6.1639931
         stats
parameter      97.5%
    alpha -0.1263879
    beta   1.1456729
    sigma  0.6654307
    lp__   6.6888366

, , chains = chain:3

         stats
parameter       mean         sd       2.5%        25%        50%        75%
    alpha -1.6887255 0.80532434 -3.2617026 -2.2222136 -1.6864496 -1.1743004
    beta   0.8380354 0.15925291  0.5258965  0.7360263  0.8380061  0.9428287
    sigma  0.4684143 0.08526311  0.3350406  0.4073257  0.4576315  0.5150687
    lp__   5.1541205 1.33801376  1.6616202  4.5440349  5.4994034  6.1405086
         stats
parameter       97.5%
    alpha -0.09619505
    beta   1.14824944
    sigma  0.66858330
    lp__   6.68718619

, , chains = chain:4

         stats
parameter       mean         sd       2.5%        25%        50%        75%
    alpha -1.6909732 0.77681509 -3.2476584 -2.1809000 -1.6810615 -1.1927843
    beta   0.8378047 0.15433464  0.5359377  0.7379582  0.8362941  0.9359853
    sigma  0.4686885 0.08730847  0.3366558  0.4061326  0.4555864  0.5163916
    lp__   5.2111860 1.31060863  1.8419849  4.6293295  5.5445768  6.1509453
         stats
parameter      97.5%
    alpha -0.1752109
    beta   1.1466815
    sigma  0.6726441
    lp__   6.6924969

Step 4: Plots and interpretation

# Trace plots to check convergence
# (the mcmc_* functions come from bayesplot, which is not attached above)
library(bayesplot)
mcmc_trace(resultados_stan_calif, pars = c("alpha", "beta", "sigma"))

# Posterior densities of the parameter estimates
mcmc_dens(resultados_stan_calif, pars = c("alpha", "beta", "sigma"))

# Credible intervals for the parameters
mcmc_intervals(resultados_stan_calif, pars = c("alpha", "beta", "sigma"))

# Joint posterior of 'alpha', 'beta', and 'sigma'
mcmc_pairs(resultados_stan_calif, pars = c("alpha", "beta", "sigma"))

Interpretation of the parameters:

  • Alpha: The estimated intercept (alpha) is approximately -1.69, with a 95% credible interval from -3.24 to -0.16. This means that, on average, when the Moody's rating is zero the S&P rating would lie between -3.24 and -0.16; since a rating of zero is outside the observed range, the intercept acts mainly as a fitting constant. The posterior standard deviation of this estimate is about 0.78, reflecting its uncertainty.

  • Beta: The estimated slope (beta) is approximately 0.84, with a 95% credible interval from 0.53 to 1.15. This indicates that, on average, each one-unit increase in the Moody's rating is associated with an increase of between 0.53 and 1.15 units in the S&P rating. The posterior standard deviation of this estimate is about 0.16.

  • Sigma: The residual standard deviation (sigma) is estimated at approximately 0.47, with a 95% credible interval from 0.34 to 0.67. This measures the spread of the errors around the regression line. The posterior standard deviation of this estimate is about 0.09.
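
Combining the posterior means reported above, the fitted relationship between the two ratings is approximately:

```latex
\widehat{SP} \;=\; -1.69 \;+\; 0.84 \cdot MO
```

so each additional Moody's point translates into roughly 0.84 additional S&P points.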

Convergence diagnostics:

The convergence diagnostics (n_eff and Rhat) look good for all parameters, indicating that the chains converged well.

In summary, this Bayesian analysis provides precise estimates of the linear regression parameters, allowing us to understand the relationship between Moody's and S&P ratings for the 20 financial firms. The results support the initial idea of a positive association between the two rating agencies: the 95% credible interval for beta excludes zero.

Posterior predictions:

# Generate posterior predictions manually: posterior_predict() works on
# rstanarm fits, not on a raw stanfit object, so we reconstruct the draws by hand

# Convert the stanfit object to a data frame
muestras_df <- as.data.frame(extract(resultados_stan_calif))

# Number of posterior draws and number of observations in the data
num_muestras <- nrow(muestras_df)
num_observaciones <- datos_stan_calif$N

# Initialize a vector to store the posterior predictions
predicciones_posteriores <- numeric(length = num_muestras * num_observaciones)

# Compute the posterior predictions for each draw
for (i in 1:num_muestras) {
  alpha <- muestras_df[i, "alpha"]
  beta <- muestras_df[i, "beta"]
  sigma <- muestras_df[i, "sigma"]
  
  # Simulate from the posterior predictive using the sampled parameters
  predicciones <- rnorm(n = num_observaciones, mean = alpha + beta * datos_stan_calif$x, sd = sigma)
  
  # Store the predictions in the predicciones_posteriores vector
  predicciones_posteriores[((i - 1) * num_observaciones + 1):(i * num_observaciones)] <- predicciones
}

# Histogram of the posterior predictions
hist(predicciones_posteriores, main = "Posterior predictions", xlab = "S&P rating", ylab = "Frequency", col = "lightblue")

# Add vertical lines at the observed values
abline(v = datos_stan_calif$y, col = "red", lwd = 2)

samples <- extract(resultados_stan_calif)

n_obs <- length(calificaciones$SP)  # Number of observations
n_samples <- dim(samples$alpha)[1]  # Number of MCMC draws
yrep <- matrix(NA, nrow = n_obs, ncol = n_samples)  # Matrix to store predictions

# Note: these are draws of the regression mean (no sigma noise), which is what
# we want when comparing observed values against fitted means
for (i in 1:n_samples) {
  yrep[, i] <- samples$alpha[i] +
               samples$beta[i] * calificaciones$MO
}

predicciones_media <- apply(yrep, 1, mean)

if (length(calificaciones$SP) == length(predicciones_media)) {
  # First create the scatterplot
  plot(calificaciones$SP, predicciones_media,
       xlab = "Observed values",
       ylab = "Mean of predicted values",
       main = "PPC: observed values vs. mean predicted values")
  
  # Then add the 45-degree reference line with abline()
  abline(a = 0, b = 1, col = "red")
} else {
  stop("The lengths of the observed data and the predictions do not match.")
}

The plot compares the observed ratings with the predictions generated by the model. Several deviations between the observed data and the model's predictions are visible, suggesting areas where the model could be refined to improve predictive accuracy.

Exercise 4:

Part 1: Data

# Y  = annual salary in thousands of USD
# X1 = work quality index
# X2 = years of experience
# X3 = publication success index

salarios <- read.table("data/salarios.txt", header = TRUE, sep = "", dec = ".")


# Scatterplot of Y vs X1
ggplot(salarios, aes(x = X1, y = Y)) + 
  geom_point() + 
  labs(title = "Salary vs. work quality", x = "Work quality index (X1)", y = "Annual salary (Y)")

# Scatterplot of Y vs X2
ggplot(salarios, aes(x = X2, y = Y)) + 
  geom_point() + 
  labs(title = "Salary vs. years of experience", x = "Years of experience (X2)", y = "Annual salary (Y)")

# Scatterplot of Y vs X3
ggplot(salarios, aes(x = X3, y = Y)) + 
  geom_point() + 
  labs(title = "Salary vs. publication success", x = "Publication success index (X3)", y = "Annual salary (Y)")

Part 2: Model
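
The file Ej4-modelo.stan is not shown; given the parameters and predictive quantities extracted below, the model is presumably the normal multiple regression:

```latex
Y_i \;\sim\; \mathcal{N}\!\left(\alpha + \beta_1 X_{1i} + \beta_2 X_{2i} + \beta_3 X_{3i},\; \sigma^2\right),
\qquad i = 1, \dots, N,
```

with posterior predictive draws generated for the N_new = 5 new candidate profiles passed in the data list.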

datos_salarios <- list(N = nrow(salarios),
                   X1 = salarios$X1,
                   X2 = salarios$X2,
                   X3 = salarios$X3,
                   Y = salarios$Y,
                   N_new = 5,
                   X1_new = c(5, 4, 17, 6, 0),
                   X2_new = c(6, 2, 12, 5, 8),
                   X3_new = c(6, 4, 21, 6, 1))

modelo_salarios <- stan_model(file = "Ej4-modelo.stan")

ajuste_salarios <- sampling(modelo_salarios, data = datos_salarios, iter = 10000, chains = 4)

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 4.6e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.46 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 1: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 1: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 1: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 1: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 1: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 1: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 1: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 1: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 1: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 1: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 1: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.557 seconds (Warm-up)
Chain 1:                0.644 seconds (Sampling)
Chain 1:                1.201 seconds (Total)
Chain 1: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 2).
Chain 2: 
Chain 2: Gradient evaluation took 1.1e-05 seconds
Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 0.11 seconds.
Chain 2: Adjust your expectations accordingly!
Chain 2: 
Chain 2: 
Chain 2: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 2: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 2: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 2: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 2: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 2: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 2: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 2: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 2: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 2: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 2: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 2: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 2: 
Chain 2:  Elapsed Time: 0.566 seconds (Warm-up)
Chain 2:                0.615 seconds (Sampling)
Chain 2:                1.181 seconds (Total)
Chain 2: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 3).
Chain 3: 
Chain 3: Gradient evaluation took 5e-06 seconds
Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 0.05 seconds.
Chain 3: Adjust your expectations accordingly!
Chain 3: 
Chain 3: 
Chain 3: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 3: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 3: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 3: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 3: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 3: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 3: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 3: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 3: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 3: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 3: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 3: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 3: 
Chain 3:  Elapsed Time: 0.62 seconds (Warm-up)
Chain 3:                0.698 seconds (Sampling)
Chain 3:                1.318 seconds (Total)
Chain 3: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 4).
Chain 4: 
Chain 4: Gradient evaluation took 7e-06 seconds
Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 0.07 seconds.
Chain 4: Adjust your expectations accordingly!
Chain 4: 
Chain 4: 
Chain 4: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 4: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 4: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 4: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 4: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 4: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 4: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 4: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 4: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 4: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 4: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 4: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 4: 
Chain 4:  Elapsed Time: 0.607 seconds (Warm-up)
Chain 4:                0.603 seconds (Sampling)
Chain 4:                1.21 seconds (Total)
Chain 4: 
print(ajuste_salarios, pars = c("alpha", "beta1", "beta2", "beta3", "sigma"))
Inference for Stan model: anon_model.
4 chains, each with iter=10000; warmup=5000; thin=1; 
post-warmup draws per chain=5000, total post-warmup draws=20000.

       mean se_mean   sd  2.5%   25%   50%   75% 97.5% n_eff Rhat
alpha 17.11    0.02 2.09 12.87 15.75 17.15 18.48 21.09  9766    1
beta1  1.16    0.00 0.35  0.48  0.94  1.16  1.39  1.86 10711    1
beta2  0.32    0.00 0.04  0.24  0.30  0.32  0.35  0.40 12307    1
beta3  1.36    0.00 0.31  0.75  1.15  1.35  1.56  1.99 11048    1
sigma  1.84    0.00 0.31  1.35  1.62  1.80  2.01  2.53 10956    1

Samples were drawn using NUTS(diag_e) at Sun Mar 17 13:33:12 2024.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at 
convergence, Rhat=1).
predicciones_salarios <- extract(ajuste_salarios)$salary_pred

Parte 3: Interpretación y Gráficas

  • alpha (intercept): The posterior mean is 17.11, suggesting that when all predictors are zero the expected annual salary would be about 17,110 USD (salaries are measured in thousands of dollars).

  • beta1 (coefficient for X1, the work-quality index): The mean of 1.16 indicates that each one-unit increase in the work-quality index is associated with an expected increase of about 1,160 USD in annual salary, holding the other variables constant.

  • beta2 (coefficient for X2, years of experience): The mean of 0.32 implies an expected increase of about 320 USD in annual salary per additional year of experience, ceteris paribus.

  • beta3 (coefficient for X3, the publication-success index): With a mean of 1.36, each additional point on the publication-success index is associated with an expected increase of about 1,360 USD in annual salary, holding the other variables constant.

  • sigma (error standard deviation): The posterior mean is 1.84, so the standard deviation of the errors around the regression line is roughly 1,840 USD.

  • The n_eff values are high and every Rhat equals 1, indicating that the samples are informative and the chains have converged well.
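
The interval columns of the summary can be reproduced directly from posterior draws. A minimal, self-contained sketch, using simulated normal draws with the reported mean (1.16) and sd (0.35) of beta1 as stand-ins for the real posterior sample:

```r
set.seed(1)
# Stand-in draws for beta1 (hypothetical; the real draws come from extract(ajuste_salarios))
beta1_draws <- rnorm(20000, mean = 1.16, sd = 0.35)

# 95% credible interval: should land near the 2.5% / 97.5% columns of the summary
ci_beta1 <- quantile(beta1_draws, probs = c(0.025, 0.975))

# Posterior probability that work quality raises salary (beta1 > 0)
prob_pos <- mean(beta1_draws > 0)
```

With the real fit, replacing `beta1_draws` by `extract(ajuste_salarios)$beta1` gives the exact summary values.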

mcmc_hist(ajuste_salarios, pars = c("alpha", "beta1", "beta2", "beta3", "sigma"))
`stat_bin()` using `bins = 30`. Pick better value with `binwidth`.

ggplot(salarios, aes(x = X1, y = Y)) + 
  geom_point() + 
  geom_smooth(method = "lm", formula = y ~ x, se = TRUE) +
  labs(title = "Scatterplot of Salary vs. Work Quality with Regression Line",
       x = "Work-Quality Index (X1)", y = "Annual Salary (Y)")

ggplot(salarios, aes(x = X2, y = Y)) + 
  geom_point() + 
  geom_smooth(method = "lm", formula = y ~ x, se = TRUE) +
  labs(title = "Scatterplot of Salary vs. Years of Experience",
       x = "Years of Experience (X2)", y = "Annual Salary (Y)")

ggplot(salarios, aes(x = X3, y = Y)) + 
  geom_point() + 
  geom_smooth(method = "lm", formula = y ~ x, se = TRUE) +
  labs(title = "Scatterplot of Salary vs. Publication Success with Regression Line",
       x = "Publication-Success Index (X3)", y = "Annual Salary (Y)")

mcmc_dens_overlay(ajuste_salarios)

# Posterior predictive checks (PPC)

# Generate the posterior predictions manually because posterior_predict() does not work here

# Extract the posterior samples of the model parameters
samples <- extract(ajuste_salarios)

# The parameter names in the model are 'alpha', 'beta1', 'beta2', 'beta3' and 'sigma'
# Generate a posterior predictive draw for each observation
n_obs <- length(salarios$Y)         # Number of observations
n_samples <- length(samples$alpha)  # Number of MCMC draws (extract() returns a vector here)
yrep <- matrix(NA, nrow = n_obs, ncol = n_samples)  # Matrix to store the predictions

for (i in 1:n_samples) {
  mu <- samples$alpha[i] +
        samples$beta1[i] * salarios$X1 +
        samples$beta2[i] * salarios$X2 +
        samples$beta3[i] * salarios$X3
  # Add the observation noise sigma so each column is a draw from the posterior predictive
  yrep[, i] <- rnorm(n_obs, mean = mu, sd = samples$sigma[i])
}

# 'yrep' now contains the posterior predictive draws

# Average the draws for each observation
predicciones_media <- apply(yrep, 1, mean)

# Check that 'salarios$Y' and 'predicciones_media' have the same length
if (length(salarios$Y) == length(predicciones_media)) {
  # First, the scatterplot of observed vs. predicted values
  plot(salarios$Y, predicciones_media,
       xlab = "Observed Values",
       ylab = "Mean of Predicted Values",
       main = "PPC: Observed Values vs. Mean Predicted Values")
  
  # Then add the 45-degree reference line with abline()
  abline(a = 0, b = 1, col = "red")
} else {
  stop("The observed data and the predictions have different lengths.")
}

The plot compares the observed salaries with the model's predictions. Most points lie close to the red line, indicating good agreement between observed and predicted values. There are some deviations, however, which suggests the model could still be improved in certain areas.
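
Beyond the scatterplot, a posterior predictive check can be summarized numerically with a test statistic. A self-contained sketch using hypothetical stand-in data (not the exam data), computing a posterior predictive p-value for the mean:

```r
set.seed(123)
# Hypothetical observed data and posterior predictive draws (stand-ins for salarios$Y and yrep)
y_obs <- rnorm(24, mean = 30, sd = 2)
yrep  <- replicate(1000, rnorm(24, mean = 30, sd = 2))

# Posterior predictive p-value: fraction of replicated datasets whose mean
# exceeds the observed mean; values very close to 0 or 1 signal misfit
p_val <- mean(colMeans(yrep) > mean(y_obs))
```

With the real `yrep` matrix from the chunk above (transposed so columns are draws), the same statistic checks whether the model reproduces the observed average salary.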

Exercise 5:

Part 1:

Data and binomial logit model:
# pi_i = probability of death
# x_i = exposure time to the mineral
# y_i = number of deaths
# variance = 1 / precision

# As seen in class, in this exercise b0 and b1 are specified in terms of precision rather than variance, so b_i ~ N(mu = 0, precision = 0.001); in terms of variance this is b_i ~ N(mu = 0, variance = 1000)

# The relationship between the death probability and the exposure time is modeled through the logit function, the logarithm of the odds (the probability that the event occurs divided by the probability that it does not): logit(pi_i) = log(pi_i / (1 - pi_i)) = b0 + b1 * x_i
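
The logit link and its inverse are available directly in base R as `qlogis()` and `plogis()`. A small sketch with illustrative coefficient values (not the fitted ones):

```r
# Illustrative coefficients, chosen only to demonstrate the link function
b0 <- -3.5
b1 <- 0.01
x  <- 200                 # exposure time

eta <- b0 + b1 * x        # linear predictor: log-odds of death
p   <- plogis(eta)        # inverse logit: exp(eta) / (1 + exp(eta))
eta_back <- qlogis(p)     # logit(p) recovers the linear predictor
```

This is the same transformation the Stan model applies to map b0 + b1*x_i to a probability in (0, 1).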

# Read the data
datos_mortality <- read.table("data/mortality.txt", header = TRUE, sep = "")

# Prepare the data for Stan, including new data for prediction
stan_data_mortality <- list(
  N = nrow(datos_mortality),
  y = datos_mortality$y,
  n = datos_mortality$n,
  x = as.vector(datos_mortality$x),
  new_N = 100, 
  new_x = 200, 
  new_n = rep(1, 100)
)

# Fit the model to the data
fit_mortality <- stan(file = "Ej5-modelo1.stan", data = stan_data_mortality, iter = 10000, chains = 4)

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 2.7e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.27 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 1: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 1: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 1: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 1: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 1: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 1: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 1: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 1: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 1: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 1: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 1: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.112 seconds (Warm-up)
Chain 1:                0.103 seconds (Sampling)
Chain 1:                0.215 seconds (Total)
Chain 1: 

# Print a summary of the results
print(fit_mortality)
Inference for Stan model: anon_model.
4 chains, each with iter=10000; warmup=5000; thin=1; 
post-warmup draws per chain=5000, total post-warmup draws=20000.

               mean se_mean   sd    2.5%     25%     50%     75%   97.5% n_eff
alpha         -3.59    0.00 0.21   -4.02   -3.73   -3.58   -3.44   -3.19  5751
beta           0.01    0.00 0.00    0.01    0.01    0.01    0.01    0.01  6974
y_pred[1]      0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19296
y_pred[2]      0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20360
y_pred[3]      0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19765
y_pred[4]      0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19427
y_pred[5]      0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20050
y_pred[6]      0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20491
y_pred[7]      0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19710
y_pred[8]      0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19466
y_pred[9]      0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19835
y_pred[10]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19497
y_pred[11]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19994
y_pred[12]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19750
y_pred[13]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 18696
y_pred[14]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19791
y_pred[15]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19998
y_pred[16]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20687
y_pred[17]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20117
y_pred[18]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20465
y_pred[19]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19458
y_pred[20]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19848
y_pred[21]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19882
y_pred[22]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19218
y_pred[23]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19694
y_pred[24]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20422
y_pred[25]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20345
y_pred[26]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 18845
y_pred[27]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19571
y_pred[28]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20136
y_pred[29]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19200
y_pred[30]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19940
y_pred[31]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19224
y_pred[32]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20433
y_pred[33]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19998
y_pred[34]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 18866
y_pred[35]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19428
y_pred[36]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 20663
y_pred[37]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19763
y_pred[38]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 18746
y_pred[39]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20221
y_pred[40]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19816
y_pred[41]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19417
y_pred[42]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 20117
y_pred[43]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19742
y_pred[44]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19996
y_pred[45]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 18936
y_pred[46]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20742
y_pred[47]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19817
y_pred[48]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19481
y_pred[49]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19442
y_pred[50]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20189
y_pred[51]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19779
y_pred[52]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20040
y_pred[53]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19680
y_pred[54]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20330
y_pred[55]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 20108
y_pred[56]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19863
y_pred[57]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19738
y_pred[58]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19415
y_pred[59]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19556
y_pred[60]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20009
y_pred[61]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19881
y_pred[62]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19975
y_pred[63]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 20108
y_pred[64]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19905
y_pred[65]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19592
y_pred[66]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19453
y_pred[67]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20336
y_pred[68]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 20192
y_pred[69]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19899
y_pred[70]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19581
y_pred[71]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19772
y_pred[72]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19783
y_pred[73]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20388
y_pred[74]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19021
y_pred[75]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19686
y_pred[76]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20304
y_pred[77]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19836
y_pred[78]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 18803
y_pred[79]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 18920
y_pred[80]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19925
y_pred[81]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19359
y_pred[82]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19716
y_pred[83]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 18928
y_pred[84]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20543
y_pred[85]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19772
y_pred[86]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20092
y_pred[87]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20408
y_pred[88]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20215
y_pred[89]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20412
y_pred[90]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20349
y_pred[91]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19321
y_pred[92]     0.22    0.00 0.41    0.00    0.00    0.00    0.00    1.00 19668
y_pred[93]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20096
y_pred[94]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20289
y_pred[95]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19414
y_pred[96]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 18347
y_pred[97]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20628
y_pred[98]     0.23    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19708
y_pred[99]     0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 19711
y_pred[100]    0.22    0.00 0.42    0.00    0.00    0.00    0.00    1.00 20377
lp__        -161.08    0.01 0.98 -163.70 -161.46 -160.78 -160.38 -160.11  7559
            Rhat
alpha          1
beta           1
y_pred[1]      1
y_pred[2]      1
y_pred[3]      1
y_pred[4]      1
y_pred[5]      1
y_pred[6]      1
y_pred[7]      1
y_pred[8]      1
y_pred[9]      1
y_pred[10]     1
y_pred[11]     1
y_pred[12]     1
y_pred[13]     1
y_pred[14]     1
y_pred[15]     1
y_pred[16]     1
y_pred[17]     1
y_pred[18]     1
y_pred[19]     1
y_pred[20]     1
y_pred[21]     1
y_pred[22]     1
y_pred[23]     1
y_pred[24]     1
y_pred[25]     1
y_pred[26]     1
y_pred[27]     1
y_pred[28]     1
y_pred[29]     1
y_pred[30]     1
y_pred[31]     1
y_pred[32]     1
y_pred[33]     1
y_pred[34]     1
y_pred[35]     1
y_pred[36]     1
y_pred[37]     1
y_pred[38]     1
y_pred[39]     1
y_pred[40]     1
y_pred[41]     1
y_pred[42]     1
y_pred[43]     1
y_pred[44]     1
y_pred[45]     1
y_pred[46]     1
y_pred[47]     1
y_pred[48]     1
y_pred[49]     1
y_pred[50]     1
y_pred[51]     1
y_pred[52]     1
y_pred[53]     1
y_pred[54]     1
y_pred[55]     1
y_pred[56]     1
y_pred[57]     1
y_pred[58]     1
y_pred[59]     1
y_pred[60]     1
y_pred[61]     1
y_pred[62]     1
y_pred[63]     1
y_pred[64]     1
y_pred[65]     1
y_pred[66]     1
y_pred[67]     1
y_pred[68]     1
y_pred[69]     1
y_pred[70]     1
y_pred[71]     1
y_pred[72]     1
y_pred[73]     1
y_pred[74]     1
y_pred[75]     1
y_pred[76]     1
y_pred[77]     1
y_pred[78]     1
y_pred[79]     1
y_pred[80]     1
y_pred[81]     1
y_pred[82]     1
y_pred[83]     1
y_pred[84]     1
y_pred[85]     1
y_pred[86]     1
y_pred[87]     1
y_pred[88]     1
y_pred[89]     1
y_pred[90]     1
y_pred[91]     1
y_pred[92]     1
y_pred[93]     1
y_pred[94]     1
y_pred[95]     1
y_pred[96]     1
y_pred[97]     1
y_pred[98]     1
y_pred[99]     1
y_pred[100]    1
lp__           1

Samples were drawn using NUTS(diag_e) at Sun Mar 17 13:33:44 2024.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at 
convergence, Rhat=1).
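The posterior means above can be moved to the probability scale to make them interpretable. A sketch using the reported values (alpha ≈ -3.59, beta ≈ 0.0116):

```r
# Posterior means read off the summary above
alpha_hat <- -3.59
beta_hat  <- 0.0116

# Estimated death probability at exposure x = 200
p_200 <- plogis(alpha_hat + beta_hat * 200)  # about 0.22, consistent with the y_pred[i] means

# Odds ratio for a 10-unit increase in exposure time
or_10 <- exp(10 * beta_hat)                  # about 1.12
```

So at an exposure of 200 the model predicts roughly a 22% death probability, which matches the Bernoulli means of the y_pred entries, and every 10 extra units of exposure multiply the odds of death by about 1.12.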
summary(fit_mortality)
$summary
                   mean      se_mean          sd          2.5%           25%
alpha         -3.587062 2.790530e-03 0.211618651   -4.01755101   -3.72538157
beta           0.011642 1.777302e-05 0.001484183    0.00873104    0.01064694
y_pred[1]      0.226550 3.013499e-03 0.418609435    0.00000000    0.00000000
y_pred[2]      0.222700 2.915931e-03 0.416068943    0.00000000    0.00000000
y_pred[3]      0.221850 2.955422e-03 0.415501155    0.00000000    0.00000000
y_pred[4]      0.221750 2.980554e-03 0.415434191    0.00000000    0.00000000
y_pred[5]      0.221700 2.933649e-03 0.415400696    0.00000000    0.00000000
y_pred[6]      0.222550 2.905887e-03 0.415968928    0.00000000    0.00000000
y_pred[7]      0.225550 2.977073e-03 0.417954461    0.00000000    0.00000000
y_pred[8]      0.222200 2.979732e-03 0.415735254    0.00000000    0.00000000
y_pred[9]      0.227050 2.974620e-03 0.418935643    0.00000000    0.00000000
y_pred[10]     0.223600 2.984031e-03 0.416667398    0.00000000    0.00000000
y_pred[11]     0.220350 2.931343e-03 0.414493025    0.00000000    0.00000000
y_pred[12]     0.225200 2.972360e-03 0.417724412    0.00000000    0.00000000
y_pred[13]     0.222800 3.043370e-03 0.416135577    0.00000000    0.00000000
y_pred[14]     0.222750 2.957787e-03 0.416102265    0.00000000    0.00000000
y_pred[15]     0.219150 2.925286e-03 0.413680836    0.00000000    0.00000000
y_pred[16]     0.224000 2.898764e-03 0.416932479    0.00000000    0.00000000
y_pred[17]     0.224250 2.940743e-03 0.417097873    0.00000000    0.00000000
y_pred[18]     0.226000 2.923647e-03 0.418249622    0.00000000    0.00000000
y_pred[19]     0.222050 2.979645e-03 0.415634978    0.00000000    0.00000000
y_pred[20]     0.221300 2.946665e-03 0.415132421    0.00000000    0.00000000
y_pred[21]     0.226150 2.966899e-03 0.418347856    0.00000000    0.00000000
y_pred[22]     0.224800 3.011337e-03 0.417460985    0.00000000    0.00000000
y_pred[23]     0.225300 2.977107e-03 0.417790184    0.00000000    0.00000000
y_pred[24]     0.221900 2.907789e-03 0.415534624    0.00000000    0.00000000
y_pred[25]     0.223700 2.921647e-03 0.416733720    0.00000000    0.00000000
y_pred[26]     0.223600 3.035247e-03 0.416667398    0.00000000    0.00000000
y_pred[27]     0.219900 2.960648e-03 0.414189048    0.00000000    0.00000000
y_pred[28]     0.222250 2.929952e-03 0.415768662    0.00000000    0.00000000
y_pred[29]     0.226950 3.022948e-03 0.418870469    0.00000000    0.00000000
y_pred[30]     0.222850 2.947180e-03 0.416168881    0.00000000    0.00000000
y_pred[31]     0.223600 3.005152e-03 0.416667398    0.00000000    0.00000000
y_pred[32]     0.226300 2.927336e-03 0.418446012    0.00000000    0.00000000
y_pred[33]     0.225850 2.956940e-03 0.418151312    0.00000000    0.00000000
y_pred[34]     0.222100 3.026303e-03 0.415668412    0.00000000    0.00000000
y_pred[35]     0.220750 2.975648e-03 0.414762630    0.00000000    0.00000000
y_pred[36]     0.220700 2.885168e-03 0.414728960    0.00000000    0.00000000
y_pred[37]     0.225250 2.971664e-03 0.417757302    0.00000000    0.00000000
y_pred[38]     0.221600 3.033517e-03 0.415333679    0.00000000    0.00000000
y_pred[39]     0.227150 2.946549e-03 0.419000782    0.00000000    0.00000000
y_pred[40]     0.222600 2.955187e-03 0.416002275    0.00000000    0.00000000
y_pred[41]     0.224250 2.993309e-03 0.417097873    0.00000000    0.00000000
y_pred[42]     0.219950 2.920504e-03 0.414222859    0.00000000    0.00000000
y_pred[43]     0.225050 2.972302e-03 0.417625691    0.00000000    0.00000000
y_pred[44]     0.224750 2.951933e-03 0.417428018    0.00000000    0.00000000
y_pred[45]     0.225800 3.038499e-03 0.418118525    0.00000000    0.00000000
y_pred[46]     0.222800 2.889392e-03 0.416135577    0.00000000    0.00000000
y_pred[47]     0.224150 2.962428e-03 0.417031741    0.00000000    0.00000000
y_pred[48]     0.218900 2.962691e-03 0.413510991    0.00000000    0.00000000
y_pred[49]     0.224800 2.993988e-03 0.417460985    0.00000000    0.00000000
y_pred[50]     0.226450 2.945649e-03 0.418544091    0.00000000    0.00000000
y_pred[51]     0.221150 2.951101e-03 0.415031674    0.00000000    0.00000000
y_pred[52]     0.221450 2.933195e-03 0.415233089    0.00000000    0.00000000
y_pred[53]     0.224100 2.972537e-03 0.416998662    0.00000000    0.00000000
y_pred[54]     0.226050 2.933566e-03 0.418282375    0.00000000    0.00000000
y_pred[55]     0.220550 2.923994e-03 0.414627897    0.00000000    0.00000000
y_pred[56]     0.221500 2.946476e-03 0.415266628    0.00000000    0.00000000
y_pred[57]     0.221050 2.953622e-03 0.414964465    0.00000000    0.00000000
y_pred[58]     0.217750 2.962057e-03 0.412726852    0.00000000    0.00000000
y_pred[59]     0.230050 3.009651e-03 0.420875105    0.00000000    0.00000000
y_pred[60]     0.226050 2.957008e-03 0.418282375    0.00000000    0.00000000
y_pred[61]     0.222300 2.948953e-03 0.415802062    0.00000000    0.00000000
y_pred[62]     0.228250 2.969724e-03 0.419715077    0.00000000    0.00000000
y_pred[63]     0.220200 2.922335e-03 0.414391778    0.00000000    0.00000000
y_pred[64]     0.224900 2.959416e-03 0.417526893    0.00000000    0.00000000
y_pred[65]     0.221900 2.968686e-03 0.415534624    0.00000000    0.00000000
y_pred[66]     0.220350 2.971855e-03 0.414493025    0.00000000    0.00000000
y_pred[67]     0.227550 2.940024e-03 0.419261000    0.00000000    0.00000000
y_pred[68]     0.218850 2.909803e-03 0.413476995    0.00000000    0.00000000
y_pred[69]     0.223300 2.952348e-03 0.416468225    0.00000000    0.00000000
y_pred[70]     0.221500 2.967637e-03 0.415266628    0.00000000    0.00000000
y_pred[71]     0.218150 2.937145e-03 0.413000128    0.00000000    0.00000000
y_pred[72]     0.217950 2.935334e-03 0.412863561    0.00000000    0.00000000
y_pred[73]     0.221950 2.910391e-03 0.415568084    0.00000000    0.00000000
y_pred[74]     0.218000 2.993783e-03 0.412897716    0.00000000    0.00000000
y_pred[75]     0.218900 2.947174e-03 0.413510991    0.00000000    0.00000000
y_pred[76]     0.227700 2.943023e-03 0.419358442    0.00000000    0.00000000
y_pred[77]     0.223350 2.957240e-03 0.416501442    0.00000000    0.00000000
y_pred[78]     0.222800 3.034714e-03 0.416135577    0.00000000    0.00000000
y_pred[79]     0.227950 3.049939e-03 0.419520676    0.00000000    0.00000000
y_pred[80]     0.220250 2.935911e-03 0.414425536    0.00000000    0.00000000
y_pred[81]     0.219800 2.976396e-03 0.414121401    0.00000000    0.00000000
y_pred[82]     0.224850 2.973309e-03 0.417493943    0.00000000    0.00000000
y_pred[83]     0.219700 3.009583e-03 0.414053719    0.00000000    0.00000000
y_pred[84]     0.226250 2.919262e-03 0.418413302    0.00000000    0.00000000
y_pred[85]     0.222700 2.958971e-03 0.416068943    0.00000000    0.00000000
y_pred[86]     0.227850 2.959218e-03 0.419455808    0.00000000    0.00000000
y_pred[87]     0.226350 2.929340e-03 0.418478714    0.00000000    0.00000000
y_pred[88]     0.226150 2.942412e-03 0.418347856    0.00000000    0.00000000
y_pred[89]     0.226600 2.930203e-03 0.418642094    0.00000000    0.00000000
y_pred[90]     0.224500 2.925102e-03 0.417263053    0.00000000    0.00000000
y_pred[91]     0.225200 3.005192e-03 0.417724412    0.00000000    0.00000000
y_pred[92]     0.220400 2.955762e-03 0.414526756    0.00000000    0.00000000
y_pred[93]     0.229100 2.964594e-03 0.420264228    0.00000000    0.00000000
y_pred[94]     0.222600 2.920557e-03 0.416002275    0.00000000    0.00000000
y_pred[95]     0.225850 3.001070e-03 0.418151312    0.00000000    0.00000000
y_pred[96]     0.227200 3.093590e-03 0.419033339    0.00000000    0.00000000
y_pred[97]     0.222500 2.896002e-03 0.415935572    0.00000000    0.00000000
y_pred[98]     0.226000 2.979310e-03 0.418249622    0.00000000    0.00000000
y_pred[99]     0.221550 2.958056e-03 0.415300158    0.00000000    0.00000000
y_pred[100]    0.224700 2.923998e-03 0.417395042    0.00000000    0.00000000
lp__        -161.075739 1.128568e-02 0.981188895 -163.70037456 -161.46183943
                      50%           75%         97.5%     n_eff      Rhat
alpha         -3.58016164   -3.44091335   -3.18827752  5750.884 1.0006398
beta           0.01163171    0.01263898    0.01457612  6973.526 1.0006847
y_pred[1]      0.00000000    0.00000000    1.00000000 19296.377 1.0000412
y_pred[2]      0.00000000    0.00000000    1.00000000 20359.923 1.0000807
y_pred[3]      0.00000000    0.00000000    1.00000000 19765.391 0.9999933
y_pred[4]      0.00000000    0.00000000    1.00000000 19427.209 1.0001593
y_pred[5]      0.00000000    0.00000000    1.00000000 20050.169 0.9999270
y_pred[6]      0.00000000    0.00000000    1.00000000 20491.055 0.9999884
y_pred[7]      0.00000000    0.00000000    1.00000000 19709.651 1.0000487
y_pred[8]      0.00000000    0.00000000    1.00000000 19466.117 1.0001787
y_pred[9]      0.00000000    0.00000000    1.00000000 19834.975 1.0001784
y_pred[10]     0.00000000    0.00000000    1.00000000 19497.206 1.0001006
y_pred[11]     0.00000000    0.00000000    1.00000000 19994.074 0.9998529
y_pred[12]     0.00000000    0.00000000    1.00000000 19750.441 0.9999568
y_pred[13]     0.00000000    0.00000000    1.00000000 18696.500 0.9998881
y_pred[14]     0.00000000    0.00000000    1.00000000 19790.933 1.0002295
y_pred[15]     0.00000000    0.00000000    1.00000000 19998.356 0.9998915
y_pred[16]     0.00000000    0.00000000    1.00000000 20687.389 0.9999849
y_pred[17]     0.00000000    0.00000000    1.00000000 20116.940 0.9999143
y_pred[18]     0.00000000    0.00000000    1.00000000 20465.449 0.9999872
y_pred[19]     0.00000000    0.00000000    1.00000000 19457.859 1.0002576
y_pred[20]     0.00000000    0.00000000    1.00000000 19847.779 0.9999770
y_pred[21]     0.00000000    0.00000000    1.00000000 19882.441 0.9999243
y_pred[22]     0.00000000    0.00000000    1.00000000 19218.215 1.0000453
y_pred[23]     0.00000000    0.00000000    1.00000000 19693.711 1.0000728
y_pred[24]     0.00000000    0.00000000    1.00000000 20421.546 0.9998594
y_pred[25]     0.00000000    0.00000000    1.00000000 20345.190 0.9999705
y_pred[26]     0.00000000    0.00000000    1.00000000 18844.781 0.9999422
y_pred[27]     0.00000000    0.00000000    1.00000000 19571.487 0.9998819
y_pred[28]     0.00000000    0.00000000    1.00000000 20136.426 0.9998443
y_pred[29]     0.00000000    0.00000000    1.00000000 19199.859 0.9998944
y_pred[30]     0.00000000    0.00000000    1.00000000 19940.036 0.9999048
y_pred[31]     0.00000000    0.00000000    1.00000000 19224.100 1.0001408
y_pred[32]     0.00000000    0.00000000    1.00000000 20433.079 1.0000329
y_pred[33]     0.00000000    0.00000000    1.00000000 19997.791 0.9998992
y_pred[34]     0.00000000    0.00000000    1.00000000 18865.543 1.0000390
y_pred[35]     0.00000000    0.00000000    1.00000000 19428.362 1.0000225
y_pred[36]     0.00000000    0.00000000    1.00000000 20662.671 0.9998453
y_pred[37]     0.00000000    0.00000000    1.00000000 19762.814 0.9999731
y_pred[38]     0.00000000    0.00000000    1.00000000 18745.692 1.0001568
y_pred[39]     0.00000000    0.00000000    1.00000000 20220.985 1.0000054
y_pred[40]     0.00000000    0.00000000    1.00000000 19816.253 1.0000560
y_pred[41]     0.00000000    0.00000000    1.00000000 19416.582 0.9998961
[y_pred[42]–y_pred[100] omitted; every row: 50% = 0, 97.5% = 1, n_eff ≈ 18,300–20,700, Rhat ≈ 1.000]
lp__        -160.77633141 -160.37609324 -160.11098308  7558.740 1.0009859

$c_summary
, , chains = chain:1

             stats
parameter              mean          sd          2.5%           25%
  alpha         -3.58664955 0.209291032 -4.022928e+00   -3.71895642
  beta           0.01164196 0.001485002  8.686007e-03    0.01066523
  y_pred[1]      0.23020000 0.421002861  0.000000e+00    0.00000000
  [y_pred[2]–y_pred[100] omitted; means 0.212–0.236, sd ≈ 0.41–0.42, 2.5% = 25% = 0]
  lp__        -161.05442587 0.966153706 -1.636303e+02 -161.43090870
             stats
parameter               50%           75%         97.5%
  alpha         -3.57609836   -3.44156608   -3.19852989
  beta           0.01162678    0.01264862    0.01453086
  y_pred[1]      0.00000000    0.00000000    1.00000000
  [y_pred[2]–y_pred[100] omitted; every row: 50% = 0, 75% = 0, 97.5% = 1]
  lp__        -160.74720082 -160.36856097 -160.11043142

, , chains = chain:2

             stats
parameter              mean          sd          2.5%           25%
  alpha         -3.58009001 0.211420338 -4.003340e+00   -3.72166999
  beta           0.01159313 0.001483096  8.694391e-03    0.01057645
  y_pred[1]      0.21980000 0.414152465  0.000000e+00    0.00000000
  [y_pred[2]–y_pred[100] omitted; means 0.209–0.237, sd ≈ 0.41–0.43, 2.5% = 25% = 0]
  lp__        -161.07385294 0.969693235 -1.636585e+02 -161.43673140
             stats
parameter               50%           75%         97.5%
  alpha         -3.57815609   -3.43200331   -3.16650609
  beta           0.01160543    0.01260413    0.01454586
  y_pred[1]      0.00000000    0.00000000    1.00000000
  [y_pred[2]–y_pred[100] omitted; every row: 50% = 0, 75% = 0, 97.5% = 1]
  lp__        -160.78427235 -160.38381362 -160.11240738

, , chains = chain:3

             stats
parameter              mean          sd          2.5%          25%         50%
  alpha         -3.59626185 0.211435362 -4.031046e+00   -3.7349649   -3.590462
  beta           0.01170564 0.001484847  8.830389e-03    0.0106871    0.011687
  y_pred[1]      0.23620000 0.424788947  0.000000e+00    0.0000000    0.000000
  [y_pred[2]–y_pred[69] omitted; means 0.209–0.240, sd ≈ 0.41–0.43, 2.5% = 25% = 50% = 0]
  y_pred[70]     0.22320000 0.416433000  0.000000e+00    0.0000000    0.000000
  y_pred[71]     0.22380000 0.416831272  0.000000e+00    0.0000000    0.000000
  y_pred[72]     0.23140000 0.421769627  0.000000e+00    0.0000000    0.000000
  y_pred[73]     0.22160000 0.415364834  0.000000e+00    0.0000000    0.000000
  y_pred[74]     0.21000000 0.407348974  0.000000e+00    0.0000000    0.000000
  y_pred[75]     0.21800000 0.412928689  0.000000e+00    0.0000000    0.000000
  y_pred[76]     0.21780000 0.412792005  0.000000e+00    0.0000000    0.000000
  y_pred[77]     0.23400000 0.423414520  0.000000e+00    0.0000000    0.000000
  y_pred[78]     0.22580000 0.418149889  0.000000e+00    0.0000000    0.000000
  y_pred[79]     0.23360000 0.423162916  0.000000e+00    0.0000000    0.000000
  y_pred[80]     0.21800000 0.412928689  0.000000e+00    0.0000000    0.000000
  y_pred[81]     0.22100000 0.414961973  0.000000e+00    0.0000000    0.000000
  y_pred[82]     0.22580000 0.418149889  0.000000e+00    0.0000000    0.000000
  y_pred[83]     0.22480000 0.417492299  0.000000e+00    0.0000000    0.000000
  y_pred[84]     0.22320000 0.416433000  0.000000e+00    0.0000000    0.000000
  y_pred[85]     0.22900000 0.420231268  0.000000e+00    0.0000000    0.000000
  y_pred[86]     0.23080000 0.421386845  0.000000e+00    0.0000000    0.000000
  y_pred[87]     0.22100000 0.414961973  0.000000e+00    0.0000000    0.000000
  y_pred[88]     0.22980000 0.420746201  0.000000e+00    0.0000000    0.000000
  y_pred[89]     0.23840000 0.426147580  0.000000e+00    0.0000000    0.000000
  y_pred[90]     0.22860000 0.419972994  0.000000e+00    0.0000000    0.000000
  y_pred[91]     0.23060000 0.421258984  0.000000e+00    0.0000000    0.000000
  y_pred[92]     0.22460000 0.417360369  0.000000e+00    0.0000000    0.000000
  y_pred[93]     0.23000000 0.420874598  0.000000e+00    0.0000000    0.000000
  y_pred[94]     0.22080000 0.414827406  0.000000e+00    0.0000000    0.000000
  y_pred[95]     0.23140000 0.421769627  0.000000e+00    0.0000000    0.000000
  y_pred[96]     0.23720000 0.425408456  0.000000e+00    0.0000000    0.000000
  y_pred[97]     0.22740000 0.419194925  0.000000e+00    0.0000000    0.000000
  y_pred[98]     0.23040000 0.421130989  0.000000e+00    0.0000000    0.000000
  y_pred[99]     0.21380000 0.410028273  0.000000e+00    0.0000000    0.000000
  y_pred[100]    0.22760000 0.419324942  0.000000e+00    0.0000000    0.000000
  lp__        -161.08888887 0.998675899 -1.636752e+02 -161.4835253 -160.784328
             stats
parameter               75%         97.5%
  alpha         -3.44846283   -3.20776066
  beta           0.01273215    0.01464098
  y_pred[1]      0.00000000    1.00000000
  y_pred[2]      0.00000000    1.00000000
  y_pred[3]      0.00000000    1.00000000
  y_pred[4]      0.00000000    1.00000000
  y_pred[5]      0.00000000    1.00000000
  y_pred[6]      0.00000000    1.00000000
  y_pred[7]      0.00000000    1.00000000
  y_pred[8]      0.00000000    1.00000000
  y_pred[9]      0.00000000    1.00000000
  y_pred[10]     0.00000000    1.00000000
  y_pred[11]     0.00000000    1.00000000
  y_pred[12]     0.00000000    1.00000000
  y_pred[13]     0.00000000    1.00000000
  y_pred[14]     0.00000000    1.00000000
  y_pred[15]     0.00000000    1.00000000
  y_pred[16]     0.00000000    1.00000000
  y_pred[17]     0.00000000    1.00000000
  y_pred[18]     0.00000000    1.00000000
  y_pred[19]     0.00000000    1.00000000
  y_pred[20]     0.00000000    1.00000000
  y_pred[21]     0.00000000    1.00000000
  y_pred[22]     0.00000000    1.00000000
  y_pred[23]     0.00000000    1.00000000
  y_pred[24]     0.00000000    1.00000000
  y_pred[25]     0.00000000    1.00000000
  y_pred[26]     0.00000000    1.00000000
  y_pred[27]     0.00000000    1.00000000
  y_pred[28]     0.00000000    1.00000000
  y_pred[29]     0.00000000    1.00000000
  y_pred[30]     0.00000000    1.00000000
  y_pred[31]     0.00000000    1.00000000
  y_pred[32]     0.00000000    1.00000000
  y_pred[33]     0.00000000    1.00000000
  y_pred[34]     0.00000000    1.00000000
  y_pred[35]     0.00000000    1.00000000
  y_pred[36]     0.00000000    1.00000000
  y_pred[37]     0.00000000    1.00000000
  y_pred[38]     0.00000000    1.00000000
  y_pred[39]     0.00000000    1.00000000
  y_pred[40]     0.00000000    1.00000000
  y_pred[41]     0.00000000    1.00000000
  y_pred[42]     0.00000000    1.00000000
  y_pred[43]     0.00000000    1.00000000
  y_pred[44]     0.00000000    1.00000000
  y_pred[45]     0.00000000    1.00000000
  y_pred[46]     0.00000000    1.00000000
  y_pred[47]     0.00000000    1.00000000
  y_pred[48]     0.00000000    1.00000000
  y_pred[49]     0.00000000    1.00000000
  y_pred[50]     0.00000000    1.00000000
  y_pred[51]     0.00000000    1.00000000
  y_pred[52]     0.00000000    1.00000000
  y_pred[53]     0.00000000    1.00000000
  y_pred[54]     0.00000000    1.00000000
  y_pred[55]     0.00000000    1.00000000
  y_pred[56]     0.00000000    1.00000000
  y_pred[57]     0.00000000    1.00000000
  y_pred[58]     0.00000000    1.00000000
  y_pred[59]     0.00000000    1.00000000
  y_pred[60]     0.00000000    1.00000000
  y_pred[61]     0.00000000    1.00000000
  y_pred[62]     0.00000000    1.00000000
  y_pred[63]     0.00000000    1.00000000
  y_pred[64]     0.00000000    1.00000000
  y_pred[65]     0.00000000    1.00000000
  y_pred[66]     0.00000000    1.00000000
  y_pred[67]     0.00000000    1.00000000
  y_pred[68]     0.00000000    1.00000000
  y_pred[69]     0.00000000    1.00000000
  y_pred[70]     0.00000000    1.00000000
  y_pred[71]     0.00000000    1.00000000
  y_pred[72]     0.00000000    1.00000000
  y_pred[73]     0.00000000    1.00000000
  y_pred[74]     0.00000000    1.00000000
  y_pred[75]     0.00000000    1.00000000
  y_pred[76]     0.00000000    1.00000000
  y_pred[77]     0.00000000    1.00000000
  y_pred[78]     0.00000000    1.00000000
  y_pred[79]     0.00000000    1.00000000
  y_pred[80]     0.00000000    1.00000000
  y_pred[81]     0.00000000    1.00000000
  y_pred[82]     0.00000000    1.00000000
  y_pred[83]     0.00000000    1.00000000
  y_pred[84]     0.00000000    1.00000000
  y_pred[85]     0.00000000    1.00000000
  y_pred[86]     0.00000000    1.00000000
  y_pred[87]     0.00000000    1.00000000
  y_pred[88]     0.00000000    1.00000000
  y_pred[89]     0.00000000    1.00000000
  y_pred[90]     0.00000000    1.00000000
  y_pred[91]     0.00000000    1.00000000
  y_pred[92]     0.00000000    1.00000000
  y_pred[93]     0.00000000    1.00000000
  y_pred[94]     0.00000000    1.00000000
  y_pred[95]     0.00000000    1.00000000
  y_pred[96]     0.00000000    1.00000000
  y_pred[97]     0.00000000    1.00000000
  y_pred[98]     0.00000000    1.00000000
  y_pred[99]     0.00000000    1.00000000
  y_pred[100]    0.00000000    1.00000000
  lp__        -160.37876368 -160.11232189

, , chains = chain:4

             stats
parameter              mean          sd          2.5%           25%
  alpha         -3.58524741 0.214041455 -4.014947e+00   -3.72714404
  beta           0.01162728 0.001481988  8.699375e-03    0.01066153
  y_pred[1]      0.22000000 0.414287734  0.000000e+00    0.00000000
  y_pred[2]      0.23320000 0.422910783  0.000000e+00    0.00000000
  y_pred[3]      0.21960000 0.414017055  0.000000e+00    0.00000000
  y_pred[4]      0.20700000 0.405196047  0.000000e+00    0.00000000
  y_pred[5]      0.21700000 0.412243847  0.000000e+00    0.00000000
  y_pred[6]      0.22220000 0.415766440  0.000000e+00    0.00000000
  y_pred[7]      0.22460000 0.417360369  0.000000e+00    0.00000000
  y_pred[8]      0.21380000 0.410028273  0.000000e+00    0.00000000
  y_pred[9]      0.23280000 0.422658122  0.000000e+00    0.00000000
  y_pred[10]     0.21820000 0.413065230  0.000000e+00    0.00000000
  y_pred[11]     0.22360000 0.416698653  0.000000e+00    0.00000000
  y_pred[12]     0.21900000 0.413609979  0.000000e+00    0.00000000
  y_pred[13]     0.23000000 0.420874598  0.000000e+00    0.00000000
  y_pred[14]     0.22040000 0.414557851  0.000000e+00    0.00000000
  y_pred[15]     0.22160000 0.415364834  0.000000e+00    0.00000000
  y_pred[16]     0.22120000 0.415096400  0.000000e+00    0.00000000
  y_pred[17]     0.22540000 0.417887265  0.000000e+00    0.00000000
  y_pred[18]     0.22000000 0.414287734  0.000000e+00    0.00000000
  y_pred[19]     0.22440000 0.417228302  0.000000e+00    0.00000000
  y_pred[20]     0.21820000 0.413065230  0.000000e+00    0.00000000
  y_pred[21]     0.23000000 0.420874598  0.000000e+00    0.00000000
  y_pred[22]     0.23000000 0.420874598  0.000000e+00    0.00000000
  y_pred[23]     0.22520000 0.417755747  0.000000e+00    0.00000000
  y_pred[24]     0.22480000 0.417492299  0.000000e+00    0.00000000
  y_pred[25]     0.22340000 0.416565896  0.000000e+00    0.00000000
  y_pred[26]     0.21980000 0.414152465  0.000000e+00    0.00000000
  y_pred[27]     0.21900000 0.413609979  0.000000e+00    0.00000000
  y_pred[28]     0.21820000 0.413065230  0.000000e+00    0.00000000
  y_pred[29]     0.23040000 0.421130989  0.000000e+00    0.00000000
  y_pred[30]     0.23020000 0.421002861  0.000000e+00    0.00000000
  y_pred[31]     0.23020000 0.421002861  0.000000e+00    0.00000000
  y_pred[32]     0.21980000 0.414152465  0.000000e+00    0.00000000
  y_pred[33]     0.22500000 0.417624092  0.000000e+00    0.00000000
  y_pred[34]     0.23520000 0.424166174  0.000000e+00    0.00000000
  y_pred[35]     0.22200000 0.415632711  0.000000e+00    0.00000000
  y_pred[36]     0.21880000 0.413474004  0.000000e+00    0.00000000
  y_pred[37]     0.22180000 0.415498842  0.000000e+00    0.00000000
  y_pred[38]     0.22080000 0.414827406  0.000000e+00    0.00000000
  y_pred[39]     0.22340000 0.416565896  0.000000e+00    0.00000000
  y_pred[40]     0.23100000 0.421514573  0.000000e+00    0.00000000
  y_pred[41]     0.22720000 0.419064772  0.000000e+00    0.00000000
  y_pred[42]     0.21680000 0.412106450  0.000000e+00    0.00000000
  y_pred[43]     0.22320000 0.416433000  0.000000e+00    0.00000000
  y_pred[44]     0.22560000 0.418018645  0.000000e+00    0.00000000
  y_pred[45]     0.22880000 0.420102198  0.000000e+00    0.00000000
  y_pred[46]     0.21780000 0.412792005  0.000000e+00    0.00000000
  y_pred[47]     0.21780000 0.412792005  0.000000e+00    0.00000000
  y_pred[48]     0.22300000 0.416299965  0.000000e+00    0.00000000
  y_pred[49]     0.22780000 0.419454823  0.000000e+00    0.00000000
  y_pred[50]     0.22160000 0.415364834  0.000000e+00    0.00000000
  y_pred[51]     0.21720000 0.412381100  0.000000e+00    0.00000000
  y_pred[52]     0.22620000 0.418411967  0.000000e+00    0.00000000
  y_pred[53]     0.23180000 0.422024147  0.000000e+00    0.00000000
  y_pred[54]     0.22100000 0.414961973  0.000000e+00    0.00000000
  y_pred[55]     0.22520000 0.417755747  0.000000e+00    0.00000000
  y_pred[56]     0.22340000 0.416565896  0.000000e+00    0.00000000
  y_pred[57]     0.22760000 0.419324942  0.000000e+00    0.00000000
  y_pred[58]     0.22060000 0.414692698  0.000000e+00    0.00000000
  y_pred[59]     0.21820000 0.413065230  0.000000e+00    0.00000000
  y_pred[60]     0.22240000 0.415900030  0.000000e+00    0.00000000
  y_pred[61]     0.21740000 0.412518211  0.000000e+00    0.00000000
  y_pred[62]     0.23500000 0.424041227  0.000000e+00    0.00000000
  y_pred[63]     0.21720000 0.412381100  0.000000e+00    0.00000000
  y_pred[64]     0.22840000 0.419843654  0.000000e+00    0.00000000
  y_pred[65]     0.22320000 0.416433000  0.000000e+00    0.00000000
  y_pred[66]     0.21860000 0.413337888  0.000000e+00    0.00000000
  y_pred[67]     0.22460000 0.417360369  0.000000e+00    0.00000000
  y_pred[68]     0.22280000 0.416166792  0.000000e+00    0.00000000
  y_pred[69]     0.21600000 0.411555434  0.000000e+00    0.00000000
  y_pred[70]     0.22440000 0.417228302  0.000000e+00    0.00000000
  y_pred[71]     0.21980000 0.414152465  0.000000e+00    0.00000000
  y_pred[72]     0.21400000 0.410167828  0.000000e+00    0.00000000
  y_pred[73]     0.22340000 0.416565896  0.000000e+00    0.00000000
  y_pred[74]     0.22000000 0.414287734  0.000000e+00    0.00000000
  y_pred[75]     0.22160000 0.415364834  0.000000e+00    0.00000000
  y_pred[76]     0.22740000 0.419194925  0.000000e+00    0.00000000
  y_pred[77]     0.21540000 0.411140666  0.000000e+00    0.00000000
  y_pred[78]     0.21580000 0.411417322  0.000000e+00    0.00000000
  y_pred[79]     0.21560000 0.411279066  0.000000e+00    0.00000000
  y_pred[80]     0.22080000 0.414827406  0.000000e+00    0.00000000
  y_pred[81]     0.22220000 0.415766440  0.000000e+00    0.00000000
  y_pred[82]     0.22020000 0.414422863  0.000000e+00    0.00000000
  y_pred[83]     0.21780000 0.412792005  0.000000e+00    0.00000000
  y_pred[84]     0.23520000 0.424166174  0.000000e+00    0.00000000
  y_pred[85]     0.22000000 0.414287734  0.000000e+00    0.00000000
  y_pred[86]     0.23200000 0.422151208  0.000000e+00    0.00000000
  y_pred[87]     0.23300000 0.422784519  0.000000e+00    0.00000000
  y_pred[88]     0.21980000 0.414152465  0.000000e+00    0.00000000
  y_pred[89]     0.22520000 0.417755747  0.000000e+00    0.00000000
  y_pred[90]     0.22640000 0.418542800  0.000000e+00    0.00000000
  y_pred[91]     0.22120000 0.415096400  0.000000e+00    0.00000000
  y_pred[92]     0.21240000 0.409047313  0.000000e+00    0.00000000
  y_pred[93]     0.23240000 0.422404930  0.000000e+00    0.00000000
  y_pred[94]     0.21620000 0.411693403  0.000000e+00    0.00000000
  y_pred[95]     0.21820000 0.413065230  0.000000e+00    0.00000000
  y_pred[96]     0.22180000 0.415498842  0.000000e+00    0.00000000
  y_pred[97]     0.21780000 0.412792005  0.000000e+00    0.00000000
  y_pred[98]     0.21820000 0.413065230  0.000000e+00    0.00000000
  y_pred[99]     0.21920000 0.413745812  0.000000e+00    0.00000000
  y_pred[100]    0.23340000 0.423036916  0.000000e+00    0.00000000
  lp__        -161.08578738 0.989778257 -1.637699e+02 -161.47302928
             stats
parameter               50%           75%         97.5%
  alpha         -3.57729975   -3.44062084   -3.18630558
  beta           0.01160923    0.01258724    0.01463712
  y_pred[1]      0.00000000    0.00000000    1.00000000
  y_pred[2]      0.00000000    0.00000000    1.00000000
  y_pred[3]      0.00000000    0.00000000    1.00000000
  y_pred[4]      0.00000000    0.00000000    1.00000000
  y_pred[5]      0.00000000    0.00000000    1.00000000
  y_pred[6]      0.00000000    0.00000000    1.00000000
  y_pred[7]      0.00000000    0.00000000    1.00000000
  y_pred[8]      0.00000000    0.00000000    1.00000000
  y_pred[9]      0.00000000    0.00000000    1.00000000
  y_pred[10]     0.00000000    0.00000000    1.00000000
  y_pred[11]     0.00000000    0.00000000    1.00000000
  y_pred[12]     0.00000000    0.00000000    1.00000000
  y_pred[13]     0.00000000    0.00000000    1.00000000
  y_pred[14]     0.00000000    0.00000000    1.00000000
  y_pred[15]     0.00000000    0.00000000    1.00000000
  y_pred[16]     0.00000000    0.00000000    1.00000000
  y_pred[17]     0.00000000    0.00000000    1.00000000
  y_pred[18]     0.00000000    0.00000000    1.00000000
  y_pred[19]     0.00000000    0.00000000    1.00000000
  y_pred[20]     0.00000000    0.00000000    1.00000000
  y_pred[21]     0.00000000    0.00000000    1.00000000
  y_pred[22]     0.00000000    0.00000000    1.00000000
  y_pred[23]     0.00000000    0.00000000    1.00000000
  y_pred[24]     0.00000000    0.00000000    1.00000000
  y_pred[25]     0.00000000    0.00000000    1.00000000
  y_pred[26]     0.00000000    0.00000000    1.00000000
  y_pred[27]     0.00000000    0.00000000    1.00000000
  y_pred[28]     0.00000000    0.00000000    1.00000000
  y_pred[29]     0.00000000    0.00000000    1.00000000
  y_pred[30]     0.00000000    0.00000000    1.00000000
  y_pred[31]     0.00000000    0.00000000    1.00000000
  y_pred[32]     0.00000000    0.00000000    1.00000000
  y_pred[33]     0.00000000    0.00000000    1.00000000
  y_pred[34]     0.00000000    0.00000000    1.00000000
  y_pred[35]     0.00000000    0.00000000    1.00000000
  y_pred[36]     0.00000000    0.00000000    1.00000000
  y_pred[37]     0.00000000    0.00000000    1.00000000
  y_pred[38]     0.00000000    0.00000000    1.00000000
  y_pred[39]     0.00000000    0.00000000    1.00000000
  y_pred[40]     0.00000000    0.00000000    1.00000000
  y_pred[41]     0.00000000    0.00000000    1.00000000
  y_pred[42]     0.00000000    0.00000000    1.00000000
  y_pred[43]     0.00000000    0.00000000    1.00000000
  y_pred[44]     0.00000000    0.00000000    1.00000000
  y_pred[45]     0.00000000    0.00000000    1.00000000
  y_pred[46]     0.00000000    0.00000000    1.00000000
  y_pred[47]     0.00000000    0.00000000    1.00000000
  y_pred[48]     0.00000000    0.00000000    1.00000000
  y_pred[49]     0.00000000    0.00000000    1.00000000
  y_pred[50]     0.00000000    0.00000000    1.00000000
  y_pred[51]     0.00000000    0.00000000    1.00000000
  y_pred[52]     0.00000000    0.00000000    1.00000000
  y_pred[53]     0.00000000    0.00000000    1.00000000
  y_pred[54]     0.00000000    0.00000000    1.00000000
  y_pred[55]     0.00000000    0.00000000    1.00000000
  y_pred[56]     0.00000000    0.00000000    1.00000000
  y_pred[57]     0.00000000    0.00000000    1.00000000
  y_pred[58]     0.00000000    0.00000000    1.00000000
  y_pred[59]     0.00000000    0.00000000    1.00000000
  y_pred[60]     0.00000000    0.00000000    1.00000000
  y_pred[61]     0.00000000    0.00000000    1.00000000
  y_pred[62]     0.00000000    0.00000000    1.00000000
  y_pred[63]     0.00000000    0.00000000    1.00000000
  y_pred[64]     0.00000000    0.00000000    1.00000000
  y_pred[65]     0.00000000    0.00000000    1.00000000
  y_pred[66]     0.00000000    0.00000000    1.00000000
  y_pred[67]     0.00000000    0.00000000    1.00000000
  y_pred[68]     0.00000000    0.00000000    1.00000000
  y_pred[69]     0.00000000    0.00000000    1.00000000
  y_pred[70]     0.00000000    0.00000000    1.00000000
  y_pred[71]     0.00000000    0.00000000    1.00000000
  y_pred[72]     0.00000000    0.00000000    1.00000000
  y_pred[73]     0.00000000    0.00000000    1.00000000
  y_pred[74]     0.00000000    0.00000000    1.00000000
  y_pred[75]     0.00000000    0.00000000    1.00000000
  y_pred[76]     0.00000000    0.00000000    1.00000000
  y_pred[77]     0.00000000    0.00000000    1.00000000
  y_pred[78]     0.00000000    0.00000000    1.00000000
  y_pred[79]     0.00000000    0.00000000    1.00000000
  y_pred[80]     0.00000000    0.00000000    1.00000000
  y_pred[81]     0.00000000    0.00000000    1.00000000
  y_pred[82]     0.00000000    0.00000000    1.00000000
  y_pred[83]     0.00000000    0.00000000    1.00000000
  y_pred[84]     0.00000000    0.00000000    1.00000000
  y_pred[85]     0.00000000    0.00000000    1.00000000
  y_pred[86]     0.00000000    0.00000000    1.00000000
  y_pred[87]     0.00000000    0.00000000    1.00000000
  y_pred[88]     0.00000000    0.00000000    1.00000000
  y_pred[89]     0.00000000    0.00000000    1.00000000
  y_pred[90]     0.00000000    0.00000000    1.00000000
  y_pred[91]     0.00000000    0.00000000    1.00000000
  y_pred[92]     0.00000000    0.00000000    1.00000000
  y_pred[93]     0.00000000    0.00000000    1.00000000
  y_pred[94]     0.00000000    0.00000000    1.00000000
  y_pred[95]     0.00000000    0.00000000    1.00000000
  y_pred[96]     0.00000000    0.00000000    1.00000000
  y_pred[97]     0.00000000    0.00000000    1.00000000
  y_pred[98]     0.00000000    0.00000000    1.00000000
  y_pred[99]     0.00000000    0.00000000    1.00000000
  y_pred[100]    0.00000000    0.00000000    1.00000000
  lp__        -160.78964362 -160.37726134 -160.10980101
# Extract the parameter estimates (rstan::extract, to avoid the tidyr::extract conflict)
estimaciones_mortality <- rstan::extract(fit_mortality)
alpha_est <- estimaciones_mortality$alpha
beta_est <- estimaciones_mortality$beta

# Extract the model's posterior predictions for the new miners
y_pred_mortality <- estimaciones_mortality$y_pred

# Mean and standard deviation of the predictions for the new miners
mean_pred_mortality <- colMeans(y_pred_mortality)
sd_pred_mortality <- apply(y_pred_mortality, 2, sd)

# Report the results (y_pred is binary, so its mean is a death probability)
cat("Predictive mean of deaths for new miners:", mean(mean_pred_mortality), "\n")
Predictive mean of deaths for new miners: 0.223392 
cat("Predictive standard deviation of deaths for new miners:", mean(sd_pred_mortality), "\n")
Predictive standard deviation of deaths for new miners: 0.4165162 
Plots and Interpretation:
# Trace plots of the MCMC chains for alpha and beta
library(bayesplot)  # provides the mcmc_* plotting functions
mcmc_trace(as.array(fit_mortality), pars = c("alpha", "beta"))

# Posterior densities of alpha and beta, overlaid by chain
mcmc_dens_overlay(as.array(fit_mortality), pars = c("alpha", "beta"))

# Exposure time vs. death proportion; weight = n supplies the trial counts,
# which avoids the "non-integer #successes" warning when fitting a binomial
# GLM to proportions
ggplot(datos_mortality, aes(x = x, y = y/n, weight = n)) + 
  geom_point() + 
  stat_smooth(method = "glm", method.args = list(family = "binomial"), se = FALSE)
`geom_smooth()` using formula = 'y ~ x'

# Credible intervals for alpha and beta
mcmc_intervals(as.array(fit_mortality), pars = c("alpha", "beta"))

Interpretation of alpha (intercept): alpha is estimated at about -3.58, with a narrow 95% credible interval of roughly [-4.01, -3.19]. On the probability scale this means the baseline risk of death with no exposure is inv_logit(-3.58) ≈ 0.027, i.e. the model determines a low baseline risk with high precision.

Interpretation of beta (slope): beta is estimated at about 0.0116, with a 95% credible interval of roughly [0.0087, 0.0146]. Since the interval lies entirely above zero, the model indicates a clear positive effect of exposure time on the probability of death over the range of exposures in the data: each additional hour of exposure multiplies the odds of death by about exp(0.0116) ≈ 1.012. This slope is estimated with good precision.

Model predictions: the predictive mean is about 0.223, reflecting the model's expectation that, of every 100 miners exposed to the mineral for 200 hours, around 22 would die, given the historical data and the model's assumptions. The predictive standard deviation of about 0.417 is essentially the Bernoulli standard deviation of a binary outcome, sqrt(0.223 × 0.777) ≈ 0.416: it reflects the inherent variability of individual death outcomes, not strong or weak confidence in the estimated mean.
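As a quick sanity check (a minimal sketch, assuming the posterior means alpha ≈ -3.59 and beta ≈ 0.0116 from the summary above, and an exposure of x = 200 hours for the new miners), the predictive mean can be recovered directly from the inverse-logit:

```r
# Assumed posterior means, taken from the printed summary above
alpha_hat <- -3.59
beta_hat  <- 0.0116
x_new     <- 200  # exposure hours for the new miners

# plogis() is the inverse-logit: exp(z) / (1 + exp(z))
p_hat <- plogis(alpha_hat + beta_hat * x_new)
p_hat  # close to the predictive mean of ~0.223 reported above
```

The small gap between this plug-in value and the Monte Carlo predictive mean comes from averaging the inverse-logit over the full posterior rather than evaluating it at the posterior means.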

# PPC 

# Extract the posterior samples of the model parameters
samples <- rstan::extract(fit_mortality)

# Inverse-logit function (equivalent to stats::plogis)
inv_logit <- function(x) {
  exp(x) / (1 + exp(x))
}

# Generate posterior predictive replicates for each observation
n_obs <- length(datos_mortality$y)   # number of observations
n_samples <- length(samples$alpha)   # number of posterior draws
yrep <- matrix(NA, nrow = n_obs, ncol = n_samples)  # one column per draw

for (i in 1:n_samples) {
  mu <- samples$alpha[i] + samples$beta[i] * datos_mortality$x
  yrep[, i] <- rbinom(n_obs, datos_mortality$n, inv_logit(mu))  # binomial draw on the probability scale
}

# Per-observation mean of the predictive replicates
predicciones_media <- apply(yrep, 1, mean)

# Check that observed data and predictions have the same length
if (length(datos_mortality$y) == length(predicciones_media)) {
  # Scatterplot of observed vs. predicted
  plot(datos_mortality$y, predicciones_media,
       xlab = "Observed values",
       ylab = "Mean of predicted values",
       main = "PPC: observed values vs. mean predicted values")
  
  # Reference line y = x
  abline(a = 0, b = 1, col = "red")
} else {
  stop("Observed data and predictions have different lengths.")
}

The plot compares the observed deaths with the model's predictions. Most points fall close to the red identity line, indicating good agreement between observed and predicted values. A few deviations remain, however, suggesting the model could still be improved in some areas.
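A common complement to this scatterplot is a posterior predictive p-value on a summary statistic such as the total death count. Since the fitted object is not reproduced here, the sketch below uses simulated stand-ins for `y` and `yrep` (hypothetical shapes: 10 groups of n = 100 miners with death probability ~0.22, and 500 posterior draws):

```r
set.seed(1)

# Hypothetical stand-ins for the document's observed data and replicates
y    <- rbinom(10, size = 100, prob = 0.22)                           # observed counts
yrep <- matrix(rbinom(10 * 500, size = 100, prob = 0.22), nrow = 10)  # obs x draws

# Test statistic: total deaths across all groups
T_obs <- sum(y)
T_rep <- colSums(yrep)

# Posterior predictive p-value: fraction of replicated totals >= observed total
ppp <- mean(T_rep >= T_obs)
ppp  # values far from 0 or 1 indicate no misfit on this statistic
```

With the real `yrep` built in the loop above (transposed appropriately), the same calculation applies draw by draw.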

Part 2: "item a"

Item i)
# install.packages("boot")
library(boot)

Attaching package: 'boot'
The following object is masked from 'package:rstanarm':

    logit
# Load the data
data(coal)

# Convert the 'coal' dates to years and count disasters per year
year <- floor(coal$date)   # 'coal' is a data frame with a single column, date
disasters <- table(year)   # disaster counts per year

# Vectors of years and counts
years <- as.numeric(names(disasters))
counts <- as.numeric(disasters)

# Scatterplot
plot(years, counts, type = "p", col = "blue", xlab = "Year", ylab = "Number of disasters",
     main = "Disasters per year")

# Compile and fit the Stan model
stan_model_muertes <- stan_model(file = "Ej5-modelo2.stan")

fit_muertes <- sampling(stan_model_muertes, 
                data = list(N = length(counts), y = counts, x = years), 
                iter = 10000, 
                chains = 4)
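The file Ej5-modelo2.stan is not reproduced in this chunk. Given the data list passed above and the quantities reported below (a scalar alpha, a small negative beta, and integer y_pred counts that decline with year), a plausible sketch is a Poisson regression with a log link. Treat this as an assumption about the original file, which may differ; the "Rejecting initial value" messages below hint that the original may instead have used an identity link, where the rate can start out negative.

```stan
data {
  int<lower=1> N;              // number of years
  array[N] int<lower=0> y;     // disaster counts per year
  vector[N] x;                 // calendar years
}
parameters {
  real alpha;                  // intercept on the log-rate scale
  real beta;                   // slope per year
}
model {
  // log(rate) = alpha + beta * x; beta < 0 means a declining disaster rate
  y ~ poisson_log(alpha + beta * x);
}
generated quantities {
  array[N] int y_pred;
  for (i in 1:N)
    y_pred[i] = poisson_log_rng(alpha + beta * x[i]);
}
```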

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 1.3e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.13 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 1: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 1: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 1: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 1: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 1: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 1: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 1: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 1: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 1: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 1: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 1: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 3.309 seconds (Warm-up)
Chain 1:                2.949 seconds (Sampling)
Chain 1:                6.258 seconds (Total)
Chain 1: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 2).
Chain 2: Rejecting initial value:
Chain 2:   Log probability evaluates to log(0), i.e. negative infinity.
Chain 2:   Stan can't start sampling from this initial value.
Chain 2: Rejecting initial value:
Chain 2:   Log probability evaluates to log(0), i.e. negative infinity.
Chain 2:   Stan can't start sampling from this initial value.
Chain 2: Rejecting initial value:
Chain 2:   Log probability evaluates to log(0), i.e. negative infinity.
Chain 2:   Stan can't start sampling from this initial value.
Chain 2: 
Chain 2: Gradient evaluation took 6e-06 seconds
Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 0.06 seconds.
Chain 2: Adjust your expectations accordingly!
Chain 2: 
Chain 2: 
Chain 2: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 2: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 2: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 2: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 2: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 2: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 2: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 2: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 2: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 2: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 2: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 2: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 2: 
Chain 2:  Elapsed Time: 3.123 seconds (Warm-up)
Chain 2:                2.906 seconds (Sampling)
Chain 2:                6.029 seconds (Total)
Chain 2: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 3).
Chain 3: 
Chain 3: Gradient evaluation took 6e-06 seconds
Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 0.06 seconds.
Chain 3: Adjust your expectations accordingly!
Chain 3: 
Chain 3: 
Chain 3: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 3: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 3: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 3: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 3: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 3: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 3: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 3: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 3: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 3: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 3: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 3: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 3: 
Chain 3:  Elapsed Time: 3.033 seconds (Warm-up)
Chain 3:                2.729 seconds (Sampling)
Chain 3:                5.762 seconds (Total)
Chain 3: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 4).
Chain 4: Rejecting initial value:
Chain 4:   Log probability evaluates to log(0), i.e. negative infinity.
Chain 4:   Stan can't start sampling from this initial value.
Chain 4:  Elapsed Time: 3.043 seconds (Warm-up)
Chain 4:                3.304 seconds (Sampling)
Chain 4:                6.347 seconds (Total)
# View the model results
print(fit_muertes)
Inference for Stan model: anon_model.
4 chains, each with iter=10000; warmup=5000; thin=1; 
post-warmup draws per chain=5000, total post-warmup draws=20000.

             mean se_mean   sd   2.5%    25%    50%    75%  97.5% n_eff Rhat
alpha       22.80    0.10 4.82  13.70  19.46  22.73  26.08  32.49  2514    1
beta        -0.01    0.00 0.00  -0.02  -0.01  -0.01  -0.01  -0.01  2513    1
y_pred[1]    3.99    0.02 2.04   1.00   3.00   4.00   5.00   8.00 16646    1
y_pred[2]    3.92    0.02 2.05   1.00   2.00   4.00   5.00   8.00 15520    1
y_pred[3]    3.87    0.01 2.03   1.00   2.00   4.00   5.00   8.00 18620    1
y_pred[4]    3.84    0.02 2.01   1.00   2.00   4.00   5.00   8.00 17664    1
y_pred[5]    3.74    0.02 1.99   0.00   2.00   4.00   5.00   8.00 17495    1
y_pred[6]    3.70    0.01 1.96   0.00   2.00   3.00   5.00   8.00 18955    1
y_pred[7]    3.66    0.01 1.96   0.00   2.00   3.00   5.00   8.00 18468    1
y_pred[8]    3.57    0.01 1.92   0.00   2.00   3.00   5.00   8.00 17517    1
y_pred[9]    3.56    0.01 1.92   0.00   2.00   3.00   5.00   8.00 17359    1
y_pred[10]   3.49    0.01 1.89   0.00   2.00   3.00   5.00   8.00 19481    1
y_pred[11]   3.45    0.01 1.88   0.00   2.00   3.00   5.00   8.00 18687    1
y_pred[12]   3.36    0.01 1.85   0.00   2.00   3.00   4.00   7.00 19209    1
y_pred[13]   3.33    0.01 1.85   0.00   2.00   3.00   4.00   7.00 18975    1
y_pred[14]   3.31    0.01 1.84   0.00   2.00   3.00   4.00   7.00 19352    1
y_pred[15]   3.27    0.01 1.84   0.00   2.00   3.00   4.00   7.00 18457    1
y_pred[16]   3.19    0.01 1.81   0.00   2.00   3.00   4.00   7.00 19142    1
y_pred[17]   3.18    0.01 1.79   0.00   2.00   3.00   4.00   7.00 19935    1
y_pred[18]   3.13    0.01 1.79   0.00   2.00   3.00   4.00   7.00 19582    1
y_pred[19]   3.12    0.01 1.78   0.00   2.00   3.00   4.00   7.00 19378    1
y_pred[20]   3.04    0.01 1.76   0.00   2.00   3.00   4.00   7.00 19711    1
y_pred[21]   3.03    0.01 1.76   0.00   2.00   3.00   4.00   7.00 19947    1
y_pred[22]   3.03    0.01 1.76   0.00   2.00   3.00   4.00   7.00 18393    1
y_pred[23]   2.97    0.01 1.75   0.00   2.00   3.00   4.00   7.00 19852    1
y_pred[24]   2.94    0.01 1.72   0.00   2.00   3.00   4.00   7.00 19350    1
y_pred[25]   2.88    0.01 1.70   0.00   2.00   3.00   4.00   7.00 19221    1
y_pred[26]   2.86    0.01 1.72   0.00   2.00   3.00   4.00   7.00 19474    1
y_pred[27]   2.80    0.01 1.69   0.00   2.00   3.00   4.00   7.00 19692    1
y_pred[28]   2.79    0.01 1.69   0.00   2.00   3.00   4.00   7.00 19871    1
y_pred[29]   2.78    0.01 1.69   0.00   2.00   3.00   4.00   6.00 19545    1
y_pred[30]   2.74    0.01 1.67   0.00   2.00   3.00   4.00   6.00 19842    1
y_pred[31]   2.70    0.01 1.64   0.00   2.00   3.00   4.00   6.00 19931    1
y_pred[32]   2.66    0.01 1.63   0.00   1.00   2.00   4.00   6.00 19601    1
y_pred[33]   2.62    0.01 1.65   0.00   1.00   2.00   4.00   6.00 20094    1
y_pred[34]   2.62    0.01 1.62   0.00   1.00   2.00   4.00   6.00 19914    1
y_pred[35]   2.57    0.01 1.62   0.00   1.00   2.00   4.00   6.00 20320    1
y_pred[36]   2.55    0.01 1.60   0.00   1.00   2.00   4.00   6.00 19778    1
y_pred[37]   2.50    0.01 1.60   0.00   1.00   2.00   3.00   6.00 19571    1
y_pred[38]   2.49    0.01 1.59   0.00   1.00   2.00   3.00   6.00 19647    1
y_pred[39]   2.46    0.01 1.59   0.00   1.00   2.00   3.00   6.00 20073    1
y_pred[40]   2.41    0.01 1.56   0.00   1.00   2.00   3.00   6.00 19843    1
y_pred[41]   2.41    0.01 1.57   0.00   1.00   2.00   3.00   6.00 19759    1
y_pred[42]   2.37    0.01 1.55   0.00   1.00   2.00   3.00   6.00 19334    1
y_pred[43]   2.35    0.01 1.54   0.00   1.00   2.00   3.00   6.00 19994    1
y_pred[44]   2.26    0.01 1.50   0.00   1.00   2.00   3.00   6.00 20127    1
y_pred[45]   2.22    0.01 1.50   0.00   1.00   2.00   3.00   6.00 20102    1
y_pred[46]   2.19    0.01 1.48   0.00   1.00   2.00   3.00   5.00 19534    1
y_pred[47]   2.12    0.01 1.47   0.00   1.00   2.00   3.00   5.00 19176    1
y_pred[48]   2.10    0.01 1.46   0.00   1.00   2.00   3.00   5.00 18969    1
y_pred[49]   2.04    0.01 1.43   0.00   1.00   2.00   3.00   5.00 19767    1
y_pred[50]   2.04    0.01 1.44   0.00   1.00   2.00   3.00   5.00 19679    1
y_pred[51]   2.00    0.01 1.42   0.00   1.00   2.00   3.00   5.00 19898    1
y_pred[52]   1.95    0.01 1.41   0.00   1.00   2.00   3.00   5.00 19821    1
y_pred[53]   1.94    0.01 1.40   0.00   1.00   2.00   3.00   5.00 19609    1
y_pred[54]   1.89    0.01 1.38   0.00   1.00   2.00   3.00   5.00 18568    1
y_pred[55]   1.86    0.01 1.38   0.00   1.00   2.00   3.00   5.00 18921    1
y_pred[56]   1.80    0.01 1.35   0.00   1.00   2.00   3.00   5.00 18761    1
y_pred[57]   1.73    0.01 1.34   0.00   1.00   2.00   2.00   5.00 18917    1
y_pred[58]   1.69    0.01 1.31   0.00   1.00   1.50   2.00   5.00 19234    1
y_pred[59]   1.65    0.01 1.30   0.00   1.00   1.00   2.00   5.00 19298    1
y_pred[60]   1.64    0.01 1.29   0.00   1.00   1.00   2.00   5.00 17661    1
y_pred[61]   1.59    0.01 1.28   0.00   1.00   1.00   2.00   4.02 18041    1
y_pred[62]   1.59    0.01 1.27   0.00   1.00   1.00   2.00   4.00 19188    1
y_pred[63]   1.56    0.01 1.27   0.00   1.00   1.00   2.00   4.00 18311    1
y_pred[64]   1.55    0.01 1.26   0.00   1.00   1.00   2.00   4.00 18442    1
y_pred[65]   1.53    0.01 1.25   0.00   1.00   1.00   2.00   4.00 17284    1
y_pred[66]   1.50    0.01 1.24   0.00   1.00   1.00   2.00   4.00 18780    1
y_pred[67]   1.49    0.01 1.23   0.00   1.00   1.00   2.00   4.00 18713    1
y_pred[68]   1.48    0.01 1.23   0.00   1.00   1.00   2.00   4.00 17838    1
y_pred[69]   1.46    0.01 1.23   0.00   1.00   1.00   2.00   4.00 18734    1
y_pred[70]   1.45    0.01 1.21   0.00   1.00   1.00   2.00   4.00 17615    1
y_pred[71]   1.42    0.01 1.21   0.00   1.00   1.00   2.00   4.00 17283    1
y_pred[72]   1.40    0.01 1.20   0.00   0.00   1.00   2.00   4.00 17168    1
y_pred[73]   1.38    0.01 1.18   0.00   0.00   1.00   2.00   4.00 17201    1
y_pred[74]   1.34    0.01 1.18   0.00   0.00   1.00   2.00   4.00 17213    1
y_pred[75]   1.32    0.01 1.18   0.00   0.00   1.00   2.00   4.00 18293    1
y_pred[76]   1.26    0.01 1.14   0.00   0.00   1.00   2.00   4.00 17500    1
y_pred[77]   1.18    0.01 1.11   0.00   0.00   1.00   2.00   4.00 15720    1
y_pred[78]   1.14    0.01 1.09   0.00   0.00   1.00   2.00   4.00 17609    1
y_pred[79]   1.12    0.01 1.08   0.00   0.00   1.00   2.00   4.00 15874    1
lp__       -12.33    0.02 1.02 -15.09 -12.71 -12.03 -11.62 -11.35  3613    1

Samples were drawn using NUTS(diag_e) at Sun Mar 17 13:34:46 2024.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at 
convergence, Rhat=1).
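All Rhat values above are essentially 1 and the effective sample sizes are large, which is what we want before trusting the posterior. As a side note on what split-Rhat measures, here is a minimal base-R sketch (a hypothetical standalone example; the values in the table come from rstan's own `summary`, not from this function):

```r
# Split-Rhat sketch: each chain is split in half so that within-chain
# trends (non-stationarity) inflate the between-chain variance B.
split_rhat <- function(draws) {
  # draws: iterations x chains matrix of post-warmup samples
  n <- nrow(draws) %/% 2
  halves <- cbind(draws[1:n, , drop = FALSE],
                  draws[(n + 1):(2 * n), , drop = FALSE])
  chain_means <- colMeans(halves)
  B <- n * var(chain_means)            # between-(half-)chain variance
  W <- mean(apply(halves, 2, var))     # mean within-chain variance
  sqrt(((n - 1) / n * W + B / n) / W)  # ~1 when chains agree and are stationary
}

set.seed(1996)
# four well-mixed chains from the same distribution: split-Rhat should be ~1
draws <- matrix(rnorm(4 * 5000, mean = 22.8, sd = 4.8), ncol = 4)
split_rhat(draws)
```

With chains that mix well, as here, the statistic stays near 1; values noticeably above 1 (e.g. > 1.01) would signal that the chains have not converged to a common distribution.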
summary(fit_muertes)
$summary
                   mean      se_mean          sd         2.5%          25%
alpha       22.79895517 9.609260e-02 4.817717905  13.70060123  19.46471316
beta        -0.01157645 5.088415e-05 0.002550714  -0.01671666  -0.01331459
y_pred[1]    3.98650000 1.579152e-02 2.037406514   1.00000000   3.00000000
y_pred[2]    3.92370000 1.643751e-02 2.047751932   1.00000000   2.00000000
y_pred[3]    3.87035000 1.485779e-02 2.027423587   1.00000000   2.00000000
y_pred[4]    3.84330000 1.508650e-02 2.005080080   1.00000000   2.00000000
y_pred[5]    3.74015000 1.507464e-02 1.993897379   0.00000000   2.00000000
y_pred[6]    3.70435000 1.421065e-02 1.956484722   0.00000000   2.00000000
y_pred[7]    3.65685000 1.439509e-02 1.956269262   0.00000000   2.00000000
y_pred[8]    3.57295000 1.449277e-02 1.918140313   0.00000000   2.00000000
y_pred[9]    3.55600000 1.453488e-02 1.915031949   0.00000000   2.00000000
y_pred[10]   3.48510000 1.357360e-02 1.894533571   0.00000000   2.00000000
y_pred[11]   3.45105000 1.374376e-02 1.878770978   0.00000000   2.00000000
y_pred[12]   3.35960000 1.335534e-02 1.850988695   0.00000000   2.00000000
y_pred[13]   3.32930000 1.345100e-02 1.852871600   0.00000000   2.00000000
y_pred[14]   3.30840000 1.323868e-02 1.841645738   0.00000000   2.00000000
y_pred[15]   3.26660000 1.353304e-02 1.838557439   0.00000000   2.00000000
y_pred[16]   3.19320000 1.308516e-02 1.810369474   0.00000000   2.00000000
y_pred[17]   3.17750000 1.270250e-02 1.793503438   0.00000000   2.00000000
y_pred[18]   3.12625000 1.279743e-02 1.790829777   0.00000000   2.00000000
y_pred[19]   3.11660000 1.280519e-02 1.782544056   0.00000000   2.00000000
y_pred[20]   3.03740000 1.254447e-02 1.761208770   0.00000000   2.00000000
y_pred[21]   3.03230000 1.247404e-02 1.761735478   0.00000000   2.00000000
y_pred[22]   3.03180000 1.296325e-02 1.758079436   0.00000000   2.00000000
y_pred[23]   2.97080000 1.239020e-02 1.745737592   0.00000000   2.00000000
y_pred[24]   2.93510000 1.240030e-02 1.724916449   0.00000000   2.00000000
y_pred[25]   2.88205000 1.227828e-02 1.702258113   0.00000000   2.00000000
y_pred[26]   2.86460000 1.235423e-02 1.724040445   0.00000000   2.00000000
y_pred[27]   2.79855000 1.207272e-02 1.694140314   0.00000000   2.00000000
y_pred[28]   2.78855000 1.197874e-02 1.688573795   0.00000000   2.00000000
y_pred[29]   2.78215000 1.205589e-02 1.685447541   0.00000000   2.00000000
y_pred[30]   2.74050000 1.187302e-02 1.672453170   0.00000000   2.00000000
y_pred[31]   2.69985000 1.162352e-02 1.640973680   0.00000000   2.00000000
y_pred[32]   2.66300000 1.167085e-02 1.633941397   0.00000000   1.00000000
y_pred[33]   2.62435000 1.160702e-02 1.645318337   0.00000000   1.00000000
y_pred[34]   2.61910000 1.150980e-02 1.624237389   0.00000000   1.00000000
y_pred[35]   2.57375000 1.133428e-02 1.615670591   0.00000000   1.00000000
y_pred[36]   2.54515000 1.138045e-02 1.600465417   0.00000000   1.00000000
y_pred[37]   2.50225000 1.141195e-02 1.596471852   0.00000000   1.00000000
y_pred[38]   2.48665000 1.135982e-02 1.592277785   0.00000000   1.00000000
y_pred[39]   2.46145000 1.118852e-02 1.585193849   0.00000000   1.00000000
y_pred[40]   2.41195000 1.108299e-02 1.561207567   0.00000000   1.00000000
y_pred[41]   2.40690000 1.115709e-02 1.568328847   0.00000000   1.00000000
y_pred[42]   2.37225000 1.113528e-02 1.548321608   0.00000000   1.00000000
y_pred[43]   2.34755000 1.092258e-02 1.544466985   0.00000000   1.00000000
y_pred[44]   2.26280000 1.056367e-02 1.498648877   0.00000000   1.00000000
y_pred[45]   2.21610000 1.061233e-02 1.504630847   0.00000000   1.00000000
y_pred[46]   2.18805000 1.055444e-02 1.475125756   0.00000000   1.00000000
y_pred[47]   2.12155000 1.059064e-02 1.466554853   0.00000000   1.00000000
y_pred[48]   2.10185000 1.059478e-02 1.459206305   0.00000000   1.00000000
y_pred[49]   2.03820000 1.020598e-02 1.434901985   0.00000000   1.00000000
y_pred[50]   2.04340000 1.024804e-02 1.437609048   0.00000000   1.00000000
y_pred[51]   1.99845000 1.008999e-02 1.423288054   0.00000000   1.00000000
y_pred[52]   1.94970000 9.995903e-03 1.407291346   0.00000000   1.00000000
y_pred[53]   1.93540000 1.002937e-02 1.404430654   0.00000000   1.00000000
y_pred[54]   1.89440000 1.016336e-02 1.384898746   0.00000000   1.00000000
y_pred[55]   1.85595000 1.002546e-02 1.379055723   0.00000000   1.00000000
y_pred[56]   1.80425000 9.871823e-03 1.352155078   0.00000000   1.00000000
y_pred[57]   1.73490000 9.762754e-03 1.342762876   0.00000000   1.00000000
y_pred[58]   1.68740000 9.470825e-03 1.313494387   0.00000000   1.00000000
y_pred[59]   1.65040000 9.378483e-03 1.302829501   0.00000000   1.00000000
y_pred[60]   1.63960000 9.741509e-03 1.294602503   0.00000000   1.00000000
y_pred[61]   1.58900000 9.526337e-03 1.279554947   0.00000000   1.00000000
y_pred[62]   1.58515000 9.183229e-03 1.272057540   0.00000000   1.00000000
y_pred[63]   1.55925000 9.371392e-03 1.268136367   0.00000000   1.00000000
y_pred[64]   1.54555000 9.257850e-03 1.257220835   0.00000000   1.00000000
y_pred[65]   1.53065000 9.500852e-03 1.249055077   0.00000000   1.00000000
y_pred[66]   1.50200000 9.084569e-03 1.244939153   0.00000000   1.00000000
y_pred[67]   1.48950000 9.013669e-03 1.233031130   0.00000000   1.00000000
y_pred[68]   1.47710000 9.245398e-03 1.234808417   0.00000000   1.00000000
y_pred[69]   1.45915000 8.980369e-03 1.229148818   0.00000000   1.00000000
y_pred[70]   1.45490000 9.141800e-03 1.213317599   0.00000000   1.00000000
y_pred[71]   1.41890000 9.229129e-03 1.213299796   0.00000000   1.00000000
y_pred[72]   1.39855000 9.189073e-03 1.204026736   0.00000000   0.00000000
y_pred[73]   1.37620000 9.016016e-03 1.182473455   0.00000000   0.00000000
y_pred[74]   1.34000000 9.001959e-03 1.181046038   0.00000000   0.00000000
y_pred[75]   1.31580000 8.699564e-03 1.176622107   0.00000000   0.00000000
y_pred[76]   1.25625000 8.652599e-03 1.144618471   0.00000000   0.00000000
y_pred[77]   1.17675000 8.831049e-03 1.107235628   0.00000000   0.00000000
y_pred[78]   1.13695000 8.177947e-03 1.085197484   0.00000000   0.00000000
y_pred[79]   1.11520000 8.590615e-03 1.082352777   0.00000000   0.00000000
lp__       -12.33227107 1.691460e-02 1.016768243 -15.09346922 -12.70708288
                    50%           75%         97.5%     n_eff      Rhat
alpha       22.72926984  26.082481677  32.491124480  2513.639 1.0007614
beta        -0.01153511  -0.009813334  -0.006749275  2512.803 1.0007666
y_pred[1]    4.00000000   5.000000000   8.000000000 16645.903 1.0000521
y_pred[2]    4.00000000   5.000000000   8.000000000 15519.680 0.9999331
y_pred[3]    4.00000000   5.000000000   8.000000000 18620.050 0.9998647
y_pred[4]    4.00000000   5.000000000   8.000000000 17663.885 1.0001507
y_pred[5]    4.00000000   5.000000000   8.000000000 17494.909 0.9999916
y_pred[6]    3.00000000   5.000000000   8.000000000 18955.055 1.0000450
y_pred[7]    3.00000000   5.000000000   8.000000000 18468.370 0.9999204
y_pred[8]    3.00000000   5.000000000   8.000000000 17516.934 0.9999414
y_pred[9]    3.00000000   5.000000000   8.000000000 17359.188 0.9999203
y_pred[10]   3.00000000   5.000000000   8.000000000 19481.161 1.0000067
y_pred[11]   3.00000000   5.000000000   8.000000000 18686.856 0.9999681
y_pred[12]   3.00000000   4.000000000   7.000000000 19208.694 0.9998940
y_pred[13]   3.00000000   4.000000000   7.000000000 18974.993 0.9999770
y_pred[14]   3.00000000   4.000000000   7.000000000 19351.870 0.9999754
y_pred[15]   3.00000000   4.000000000   7.000000000 18457.109 1.0001338
y_pred[16]   3.00000000   4.000000000   7.000000000 19141.520 1.0000919
y_pred[17]   3.00000000   4.000000000   7.000000000 19935.440 1.0000306
y_pred[18]   3.00000000   4.000000000   7.000000000 19582.283 1.0001373
y_pred[19]   3.00000000   4.000000000   7.000000000 19377.986 1.0000039
y_pred[20]   3.00000000   4.000000000   7.000000000 19711.372 1.0000026
y_pred[21]   3.00000000   4.000000000   7.000000000 19946.527 1.0000590
y_pred[22]   3.00000000   4.000000000   7.000000000 18392.864 0.9999834
y_pred[23]   3.00000000   4.000000000   7.000000000 19851.866 1.0000471
y_pred[24]   3.00000000   4.000000000   7.000000000 19349.594 1.0001104
y_pred[25]   3.00000000   4.000000000   7.000000000 19220.992 0.9998832
y_pred[26]   3.00000000   4.000000000   7.000000000 19474.388 1.0000829
y_pred[27]   3.00000000   4.000000000   7.000000000 19691.926 1.0000194
y_pred[28]   3.00000000   4.000000000   7.000000000 19870.908 1.0000199
y_pred[29]   3.00000000   4.000000000   6.000000000 19544.830 1.0000401
y_pred[30]   3.00000000   4.000000000   6.000000000 19842.017 0.9999271
y_pred[31]   3.00000000   4.000000000   6.000000000 19930.942 1.0001339
y_pred[32]   2.00000000   4.000000000   6.000000000 19600.538 0.9998955
y_pred[33]   2.00000000   4.000000000   6.000000000 20093.637 0.9998730
y_pred[34]   2.00000000   4.000000000   6.000000000 19914.232 0.9998751
y_pred[35]   2.00000000   4.000000000   6.000000000 20319.732 1.0000219
y_pred[36]   2.00000000   4.000000000   6.000000000 19777.616 0.9998565
y_pred[37]   2.00000000   3.000000000   6.000000000 19570.544 1.0000296
y_pred[38]   2.00000000   3.000000000   6.000000000 19646.948 1.0000768
y_pred[39]   2.00000000   3.000000000   6.000000000 20073.332 0.9999936
y_pred[40]   2.00000000   3.000000000   6.000000000 19843.009 0.9999982
y_pred[41]   2.00000000   3.000000000   6.000000000 19759.354 1.0000496
y_pred[42]   2.00000000   3.000000000   6.000000000 19333.925 0.9999527
y_pred[43]   2.00000000   3.000000000   6.000000000 19994.309 0.9998864
y_pred[44]   2.00000000   3.000000000   6.000000000 20126.592 1.0000554
y_pred[45]   2.00000000   3.000000000   6.000000000 20101.974 0.9998914
y_pred[46]   2.00000000   3.000000000   5.000000000 19533.837 1.0001169
y_pred[47]   2.00000000   3.000000000   5.000000000 19175.748 0.9998658
y_pred[48]   2.00000000   3.000000000   5.000000000 18969.228 1.0000032
y_pred[49]   2.00000000   3.000000000   5.000000000 19766.730 1.0000953
y_pred[50]   2.00000000   3.000000000   5.000000000 19678.879 1.0000193
y_pred[51]   2.00000000   3.000000000   5.000000000 19897.756 1.0001551
y_pred[52]   2.00000000   3.000000000   5.000000000 19820.926 1.0002192
y_pred[53]   2.00000000   3.000000000   5.000000000 19608.895 1.0000498
y_pred[54]   2.00000000   3.000000000   5.000000000 18567.829 1.0000710
y_pred[55]   2.00000000   3.000000000   5.000000000 18921.475 0.9999199
y_pred[56]   2.00000000   3.000000000   5.000000000 18761.101 1.0001549
y_pred[57]   2.00000000   2.000000000   5.000000000 18917.072 1.0000610
y_pred[58]   1.50000000   2.000000000   5.000000000 19234.495 1.0000527
y_pred[59]   1.00000000   2.000000000   5.000000000 19297.898 1.0004642
y_pred[60]   1.00000000   2.000000000   5.000000000 17661.209 1.0000289
y_pred[61]   1.00000000   2.000000000   4.025000000 18041.223 1.0002414
y_pred[62]   1.00000000   2.000000000   4.000000000 19187.688 0.9998948
y_pred[63]   1.00000000   2.000000000   4.000000000 18311.492 1.0000434
y_pred[64]   1.00000000   2.000000000   4.000000000 18441.782 1.0000560
y_pred[65]   1.00000000   2.000000000   4.000000000 17283.756 0.9999386
y_pred[66]   1.00000000   2.000000000   4.000000000 18779.654 0.9999644
y_pred[67]   1.00000000   2.000000000   4.000000000 18713.061 1.0001819
y_pred[68]   1.00000000   2.000000000   4.000000000 17838.073 1.0000011
y_pred[69]   1.00000000   2.000000000   4.000000000 18733.570 1.0000175
y_pred[70]   1.00000000   2.000000000   4.000000000 17615.119 0.9998457
y_pred[71]   1.00000000   2.000000000   4.000000000 17282.829 1.0002552
y_pred[72]   1.00000000   2.000000000   4.000000000 17168.360 0.9999690
y_pred[73]   1.00000000   2.000000000   4.000000000 17200.992 1.0001445
y_pred[74]   1.00000000   2.000000000   4.000000000 17213.121 1.0000215
y_pred[75]   1.00000000   2.000000000   4.000000000 18292.750 0.9999828
y_pred[76]   1.00000000   2.000000000   4.000000000 17499.609 1.0000296
y_pred[77]   1.00000000   2.000000000   4.000000000 15720.108 0.9998241
y_pred[78]   1.00000000   2.000000000   4.000000000 17608.768 0.9999903
y_pred[79]   1.00000000   2.000000000   4.000000000 15874.101 1.0001439
lp__       -12.02559630 -11.615723032 -11.347852077  3613.439 1.0007848

$c_summary
, , chains = chain:1

            stats
parameter            mean          sd         2.5%          25%          50%
  alpha       22.97985856 4.740681068  14.28002956  19.65351995  22.75931684
  beta        -0.01167162 0.002510009  -0.01703753  -0.01342365  -0.01155592
  y_pred[1]    3.96680000 2.040816196   1.00000000   3.00000000   4.00000000
  y_pred[2]    3.94100000 2.077397920   1.00000000   2.00000000   4.00000000
  y_pred[3]    3.89100000 2.038025935   1.00000000   2.00000000   4.00000000
  y_pred[4]    3.84720000 2.015605292   1.00000000   2.00000000   4.00000000
  y_pred[5]    3.76380000 1.996899367   0.00000000   2.00000000   4.00000000
  y_pred[6]    3.73740000 1.963061885   1.00000000   2.00000000   4.00000000
  y_pred[7]    3.68020000 1.947379372   0.00000000   2.00000000   4.00000000
  y_pred[8]    3.55920000 1.914583110   0.00000000   2.00000000   3.00000000
  y_pred[9]    3.56820000 1.918563248   0.00000000   2.00000000   3.00000000
  y_pred[10]   3.48940000 1.893516496   0.00000000   2.00000000   3.00000000
  y_pred[11]   3.46680000 1.875206933   0.00000000   2.00000000   3.00000000
  y_pred[12]   3.36960000 1.859163021   0.00000000   2.00000000   3.00000000
  y_pred[13]   3.34940000 1.845968895   0.00000000   2.00000000   3.00000000
  y_pred[14]   3.28400000 1.828992357   0.00000000   2.00000000   3.00000000
  y_pred[15]   3.25600000 1.855681199   0.00000000   2.00000000   3.00000000
  y_pred[16]   3.22740000 1.803091642   0.00000000   2.00000000   3.00000000
  y_pred[17]   3.20580000 1.767956870   0.00000000   2.00000000   3.00000000
  y_pred[18]   3.19020000 1.796181843   0.00000000   2.00000000   3.00000000
  y_pred[19]   3.08920000 1.771177793   0.00000000   2.00000000   3.00000000
  y_pred[20]   3.06000000 1.787691016   0.00000000   2.00000000   3.00000000
  y_pred[21]   3.07420000 1.750687677   0.00000000   2.00000000   3.00000000
  y_pred[22]   3.03980000 1.770322787   0.00000000   2.00000000   3.00000000
  y_pred[23]   2.95460000 1.725785186   0.00000000   2.00000000   3.00000000
  y_pred[24]   2.90120000 1.715467028   0.00000000   2.00000000   3.00000000
  y_pred[25]   2.88840000 1.702388048   0.00000000   2.00000000   3.00000000
  y_pred[26]   2.90520000 1.709385082   0.00000000   2.00000000   3.00000000
  y_pred[27]   2.79580000 1.679960361   0.00000000   2.00000000   3.00000000
  y_pred[28]   2.79060000 1.700155800   0.00000000   2.00000000   3.00000000
  y_pred[29]   2.80860000 1.681467070   0.00000000   2.00000000   3.00000000
  y_pred[30]   2.74500000 1.663889631   0.00000000   2.00000000   3.00000000
  y_pred[31]   2.72640000 1.640512510   0.00000000   2.00000000   3.00000000
  y_pred[32]   2.65020000 1.623442969   0.00000000   1.00000000   2.00000000
  y_pred[33]   2.63280000 1.638200507   0.00000000   1.00000000   2.00000000
  y_pred[34]   2.60160000 1.601183999   0.00000000   1.00000000   2.00000000
  y_pred[35]   2.57880000 1.634235205   0.00000000   1.00000000   2.00000000
  y_pred[36]   2.54920000 1.590121083   0.00000000   1.00000000   2.00000000
  y_pred[37]   2.48700000 1.592211678   0.00000000   1.00000000   2.00000000
  y_pred[38]   2.50800000 1.602763167   0.00000000   1.00000000   2.00000000
  y_pred[39]   2.49720000 1.590376691   0.00000000   1.00000000   2.00000000
  y_pred[40]   2.42800000 1.566175782   0.00000000   1.00000000   2.00000000
  y_pred[41]   2.42980000 1.577456698   0.00000000   1.00000000   2.00000000
  y_pred[42]   2.35440000 1.528289166   0.00000000   1.00000000   2.00000000
  y_pred[43]   2.37240000 1.537722718   0.00000000   1.00000000   2.00000000
  y_pred[44]   2.23400000 1.488384042   0.00000000   1.00000000   2.00000000
  y_pred[45]   2.22220000 1.516076138   0.00000000   1.00000000   2.00000000
  y_pred[46]   2.14920000 1.458000176   0.00000000   1.00000000   2.00000000
  y_pred[47]   2.11620000 1.480789015   0.00000000   1.00000000   2.00000000
  y_pred[48]   2.11200000 1.473326216   0.00000000   1.00000000   2.00000000
  y_pred[49]   2.03540000 1.412425513   0.00000000   1.00000000   2.00000000
  y_pred[50]   2.04580000 1.410992750   0.00000000   1.00000000   2.00000000
  y_pred[51]   1.98020000 1.435764690   0.00000000   1.00000000   2.00000000
  y_pred[52]   1.97940000 1.418653642   0.00000000   1.00000000   2.00000000
  y_pred[53]   1.90680000 1.387695536   0.00000000   1.00000000   2.00000000
  y_pred[54]   1.88240000 1.369578544   0.00000000   1.00000000   2.00000000
  y_pred[55]   1.84640000 1.381444432   0.00000000   1.00000000   2.00000000
  y_pred[56]   1.81180000 1.349498051   0.00000000   1.00000000   2.00000000
  y_pred[57]   1.72240000 1.350445475   0.00000000   1.00000000   2.00000000
  y_pred[58]   1.66400000 1.307916713   0.00000000   1.00000000   1.00000000
  y_pred[59]   1.62860000 1.282416063   0.00000000   1.00000000   1.00000000
  y_pred[60]   1.66140000 1.298571252   0.00000000   1.00000000   1.00000000
  y_pred[61]   1.58440000 1.282811662   0.00000000   1.00000000   1.00000000
  y_pred[62]   1.58320000 1.278203630   0.00000000   1.00000000   1.00000000
  y_pred[63]   1.54920000 1.248795923   0.00000000   1.00000000   1.00000000
  y_pred[64]   1.52040000 1.240278799   0.00000000   1.00000000   1.00000000
  y_pred[65]   1.54760000 1.270140501   0.00000000   1.00000000   1.00000000
  y_pred[66]   1.50340000 1.242697590   0.00000000   1.00000000   1.00000000
  y_pred[67]   1.52820000 1.233332470   0.00000000   1.00000000   1.00000000
  y_pred[68]   1.46140000 1.225891758   0.00000000   1.00000000   1.00000000
  y_pred[69]   1.43480000 1.225744447   0.00000000   1.00000000   1.00000000
  y_pred[70]   1.44840000 1.203838563   0.00000000   1.00000000   1.00000000
  y_pred[71]   1.42300000 1.209778373   0.00000000   1.00000000   1.00000000
  y_pred[72]   1.38820000 1.212185893   0.00000000   0.00000000   1.00000000
  y_pred[73]   1.34760000 1.161827960   0.00000000   0.00000000   1.00000000
  y_pred[74]   1.34740000 1.188526718   0.00000000   0.00000000   1.00000000
  y_pred[75]   1.32400000 1.180890723   0.00000000   0.00000000   1.00000000
  y_pred[76]   1.23140000 1.128764307   0.00000000   0.00000000   1.00000000
  y_pred[77]   1.17680000 1.078598366   0.00000000   0.00000000   1.00000000
  y_pred[78]   1.13420000 1.093814265   0.00000000   0.00000000   1.00000000
  y_pred[79]   1.09400000 1.052703965   0.00000000   0.00000000   1.00000000
  lp__       -12.31977386 0.978067911 -14.91043681 -12.70612048 -12.01334101
            stats
parameter              75%        97.5%
  alpha       26.285745065  33.07249683
  beta        -0.009906864  -0.00703935
  y_pred[1]    5.000000000   8.00000000
  y_pred[2]    5.000000000   8.00000000
  y_pred[3]    5.000000000   8.00000000
  y_pred[4]    5.000000000   8.00000000
  y_pred[5]    5.000000000   8.00000000
  y_pred[6]    5.000000000   8.00000000
  y_pred[7]    5.000000000   8.00000000
  y_pred[8]    5.000000000   8.00000000
  y_pred[9]    5.000000000   8.00000000
  y_pred[10]   5.000000000   8.00000000
  y_pred[11]   5.000000000   8.00000000
  y_pred[12]   4.000000000   7.00000000
  y_pred[13]   5.000000000   7.00000000
  y_pred[14]   4.000000000   7.00000000
  y_pred[15]   4.000000000   7.00000000
  y_pred[16]   4.000000000   7.00000000
  y_pred[17]   4.000000000   7.00000000
  y_pred[18]   4.000000000   7.00000000
  y_pred[19]   4.000000000   7.00000000
  y_pred[20]   4.000000000   7.00000000
  y_pred[21]   4.000000000   7.00000000
  y_pred[22]   4.000000000   7.00000000
  y_pred[23]   4.000000000   7.00000000
  y_pred[24]   4.000000000   7.00000000
  y_pred[25]   4.000000000   7.00000000
  y_pred[26]   4.000000000   7.00000000
  y_pred[27]   4.000000000   7.00000000
  y_pred[28]   4.000000000   7.00000000
  y_pred[29]   4.000000000   6.00000000
  y_pred[30]   4.000000000   6.00000000
  y_pred[31]   4.000000000   6.00000000
  y_pred[32]   4.000000000   6.00000000
  y_pred[33]   4.000000000   6.00000000
  y_pred[34]   4.000000000   6.00000000
  y_pred[35]   4.000000000   6.00000000
  y_pred[36]   3.000000000   6.00000000
  y_pred[37]   3.000000000   6.00000000
  y_pred[38]   3.000000000   6.00000000
  y_pred[39]   3.000000000   6.00000000
  y_pred[40]   3.000000000   6.00000000
  y_pred[41]   3.000000000   6.00000000
  y_pred[42]   3.000000000   6.00000000
  y_pred[43]   3.000000000   6.00000000
  y_pred[44]   3.000000000   5.00000000
  y_pred[45]   3.000000000   6.00000000
  y_pred[46]   3.000000000   5.00000000
  y_pred[47]   3.000000000   5.00000000
  y_pred[48]   3.000000000   5.00000000
  y_pred[49]   3.000000000   5.00000000
  y_pred[50]   3.000000000   5.00000000
  y_pred[51]   3.000000000   5.00000000
  y_pred[52]   3.000000000   5.00000000
  y_pred[53]   3.000000000   5.00000000
  y_pred[54]   3.000000000   5.00000000
  y_pred[55]   3.000000000   5.00000000
  y_pred[56]   3.000000000   5.00000000
  y_pred[57]   2.000000000   5.00000000
  y_pred[58]   2.000000000   5.00000000
  y_pred[59]   2.000000000   5.00000000
  y_pred[60]   2.000000000   5.00000000
  y_pred[61]   2.000000000   4.00000000
  y_pred[62]   2.000000000   4.00000000
  y_pred[63]   2.000000000   4.00000000
  y_pred[64]   2.000000000   4.00000000
  y_pred[65]   2.000000000   4.00000000
  y_pred[66]   2.000000000   4.00000000
  y_pred[67]   2.000000000   4.00000000
  y_pred[68]   2.000000000   4.00000000
  y_pred[69]   2.000000000   4.00000000
  y_pred[70]   2.000000000   4.00000000
  y_pred[71]   2.000000000   4.00000000
  y_pred[72]   2.000000000   4.00000000
  y_pred[73]   2.000000000   4.00000000
  y_pred[74]   2.000000000   4.00000000
  y_pred[75]   2.000000000   4.00000000
  y_pred[76]   2.000000000   4.00000000
  y_pred[77]   2.000000000   4.00000000
  y_pred[78]   2.000000000   4.00000000
  y_pred[79]   2.000000000   4.00000000
  lp__       -11.613423519 -11.34407625

, , chains = chain:2

            stats
parameter            mean          sd         2.5%          25%          50%
  alpha       22.79373662 4.837741182  13.72158594  19.46978705  22.74845418
  beta        -0.01157397 0.002561543  -0.01667434  -0.01334995  -0.01155324
  y_pred[1]    4.04360000 2.044342170   1.00000000   3.00000000   4.00000000
  y_pred[2]    3.88600000 2.046128375   0.00000000   2.00000000   4.00000000
  y_pred[3]    3.84140000 2.001761087   1.00000000   2.00000000   4.00000000
  y_pred[4]    3.85120000 1.978747495   1.00000000   2.00000000   4.00000000
  y_pred[5]    3.76000000 1.995594267   0.00000000   2.00000000   4.00000000
  y_pred[6]    3.67160000 1.939099136   0.00000000   2.00000000   3.00000000
  y_pred[7]    3.62160000 1.970479637   0.00000000   2.00000000   3.00000000
  y_pred[8]    3.55320000 1.911517863   0.00000000   2.00000000   3.00000000
  y_pred[9]    3.55600000 1.920833652   0.00000000   2.00000000   3.00000000
  y_pred[10]   3.46880000 1.906345559   0.00000000   2.00000000   3.00000000
  y_pred[11]   3.44280000 1.908469704   0.00000000   2.00000000   3.00000000
  y_pred[12]   3.36180000 1.866118228   0.00000000   2.00000000   3.00000000
  y_pred[13]   3.36500000 1.886925303   0.00000000   2.00000000   3.00000000
  y_pred[14]   3.30660000 1.843387115   0.00000000   2.00000000   3.00000000
  y_pred[15]   3.32100000 1.850146915   0.00000000   2.00000000   3.00000000
  y_pred[16]   3.21880000 1.814823759   0.00000000   2.00000000   3.00000000
  y_pred[17]   3.16540000 1.805628671   0.00000000   2.00000000   3.00000000
  y_pred[18]   3.12300000 1.797697790   0.00000000   2.00000000   3.00000000
  y_pred[19]   3.09460000 1.787929019   0.00000000   2.00000000   3.00000000
  y_pred[20]   3.05880000 1.753042267   0.00000000   2.00000000   3.00000000
  y_pred[21]   3.00640000 1.760164389   0.00000000   2.00000000   3.00000000
  y_pred[22]   3.02760000 1.755521179   0.00000000   2.00000000   3.00000000
  y_pred[23]   2.99920000 1.748316528   0.00000000   2.00000000   3.00000000
  y_pred[24]   2.95540000 1.737243346   0.00000000   2.00000000   3.00000000
  y_pred[25]   2.88380000 1.687266112   0.00000000   2.00000000   3.00000000
  y_pred[26]   2.88020000 1.750560154   0.00000000   2.00000000   3.00000000
  y_pred[27]   2.81540000 1.710762456   0.00000000   2.00000000   3.00000000
  y_pred[28]   2.79380000 1.680430400   0.00000000   2.00000000   3.00000000
  y_pred[29]   2.78880000 1.701520965   0.00000000   2.00000000   3.00000000
  y_pred[30]   2.74360000 1.681078298   0.00000000   1.00000000   3.00000000
  y_pred[31]   2.66420000 1.640480599   0.00000000   1.00000000   2.50000000
  y_pred[32]   2.65740000 1.667867380   0.00000000   1.00000000   2.00000000
  y_pred[33]   2.62400000 1.644860211   0.00000000   1.00000000   2.00000000
  y_pred[34]   2.61460000 1.646392712   0.00000000   1.00000000   2.00000000
  y_pred[35]   2.60420000 1.632076927   0.00000000   1.00000000   2.00000000
  y_pred[36]   2.55600000 1.606169342   0.00000000   1.00000000   2.00000000
  y_pred[37]   2.51940000 1.582189718   0.00000000   1.00000000   2.00000000
  y_pred[38]   2.49720000 1.586472672   0.00000000   1.00000000   2.00000000
  y_pred[39]   2.45260000 1.565708635   0.00000000   1.00000000   2.00000000
  y_pred[40]   2.39720000 1.552003190   0.00000000   1.00000000   2.00000000
  y_pred[41]   2.39140000 1.557013455   0.00000000   1.00000000   2.00000000
  y_pred[42]   2.35140000 1.551257335   0.00000000   1.00000000   2.00000000
  y_pred[43]   2.34180000 1.558672080   0.00000000   1.00000000   2.00000000
  y_pred[44]   2.25100000 1.488368922   0.00000000   1.00000000   2.00000000
  y_pred[45]   2.23060000 1.501690599   0.00000000   1.00000000   2.00000000
  y_pred[46]   2.21240000 1.487322650   0.00000000   1.00000000   2.00000000
  y_pred[47]   2.10640000 1.478755011   0.00000000   1.00000000   2.00000000
  y_pred[48]   2.12380000 1.477061238   0.00000000   1.00000000   2.00000000
  y_pred[49]   2.07960000 1.448338143   0.00000000   1.00000000   2.00000000
  y_pred[50]   2.01880000 1.424096967   0.00000000   1.00000000   2.00000000
  y_pred[51]   1.95980000 1.389593521   0.00000000   1.00000000   2.00000000
  y_pred[52]   1.97020000 1.412554786   0.00000000   1.00000000   2.00000000
  y_pred[53]   1.95960000 1.414060732   0.00000000   1.00000000   2.00000000
  y_pred[54]   1.89420000 1.385564982   0.00000000   1.00000000   2.00000000
  y_pred[55]   1.84260000 1.366967066   0.00000000   1.00000000   2.00000000
  y_pred[56]   1.81860000 1.366626347   0.00000000   1.00000000   2.00000000
  y_pred[57]   1.77880000 1.347231816   0.00000000   1.00000000   2.00000000
  y_pred[58]   1.68040000 1.317498767   0.00000000   1.00000000   1.00000000
  y_pred[59]   1.63760000 1.304686430   0.00000000   1.00000000   1.00000000
  y_pred[60]   1.64480000 1.294205530   0.00000000   1.00000000   1.00000000
  y_pred[61]   1.57300000 1.267198547   0.00000000   1.00000000   1.00000000
  y_pred[62]   1.59280000 1.270948946   0.00000000   1.00000000   1.00000000
  y_pred[63]   1.56320000 1.272764607   0.00000000   1.00000000   1.00000000
  y_pred[64]   1.53980000 1.252808791   0.00000000   1.00000000   1.00000000
  y_pred[65]   1.50240000 1.246878174   0.00000000   1.00000000   1.00000000
  y_pred[66]   1.48540000 1.235189046   0.00000000   1.00000000   1.00000000
  y_pred[67]   1.44880000 1.221833514   0.00000000   1.00000000   1.00000000
  y_pred[68]   1.51160000 1.237324385   0.00000000   1.00000000   1.00000000
  y_pred[69]   1.46200000 1.227704138   0.00000000   1.00000000   1.00000000
  y_pred[70]   1.46560000 1.208928841   0.00000000   1.00000000   1.00000000
  y_pred[71]   1.41920000 1.231330417   0.00000000   0.00000000   1.00000000
  y_pred[72]   1.39540000 1.192033149   0.00000000   0.00000000   1.00000000
  y_pred[73]   1.39140000 1.194860485   0.00000000   0.00000000   1.00000000
  y_pred[74]   1.32520000 1.188834566   0.00000000   0.00000000   1.00000000
  y_pred[75]   1.30620000 1.172653650   0.00000000   0.00000000   1.00000000
  y_pred[76]   1.24960000 1.147154319   0.00000000   0.00000000   1.00000000
  y_pred[77]   1.17400000 1.119810160   0.00000000   0.00000000   1.00000000
  y_pred[78]   1.11400000 1.080387638   0.00000000   0.00000000   1.00000000
  y_pred[79]   1.11140000 1.084262498   0.00000000   0.00000000   1.00000000
  lp__       -12.32825424 1.025013831 -15.08590674 -12.69421523 -12.03784797
            stats
parameter              75%         97.5%
  alpha       26.142847995  32.460796314
  beta        -0.009815258  -0.006769324
  y_pred[1]    5.000000000   9.000000000
  y_pred[2]    5.000000000   8.000000000
  y_pred[3]    5.000000000   8.000000000
  y_pred[4]    5.000000000   8.000000000
  y_pred[5]    5.000000000   8.000000000
  y_pred[6]    5.000000000   8.000000000
  y_pred[7]    5.000000000   8.000000000
  y_pred[8]    5.000000000   8.000000000
  y_pred[9]    5.000000000   8.000000000
  y_pred[10]   5.000000000   8.000000000
  y_pred[11]   5.000000000   8.000000000
  y_pred[12]   4.000000000   7.000000000
  y_pred[13]   5.000000000   8.000000000
  y_pred[14]   4.000000000   7.000000000
  y_pred[15]   4.000000000   7.000000000
  y_pred[16]   4.000000000   7.000000000
  y_pred[17]   4.000000000   7.000000000
  y_pred[18]   4.000000000   7.000000000
  y_pred[19]   4.000000000   7.000000000
  y_pred[20]   4.000000000   7.000000000
  y_pred[21]   4.000000000   7.000000000
  y_pred[22]   4.000000000   7.000000000
  y_pred[23]   4.000000000   7.000000000
  y_pred[24]   4.000000000   7.000000000
  y_pred[25]   4.000000000   7.000000000
  y_pred[26]   4.000000000   7.000000000
  y_pred[27]   4.000000000   7.000000000
  y_pred[28]   4.000000000   6.000000000
  y_pred[29]   4.000000000   7.000000000
  y_pred[30]   4.000000000   7.000000000
  y_pred[31]   4.000000000   6.000000000
  y_pred[32]   4.000000000   6.000000000
  y_pred[33]   4.000000000   6.000000000
  y_pred[34]   4.000000000   6.000000000
  y_pred[35]   4.000000000   6.000000000
  y_pred[36]   3.000000000   6.000000000
  y_pred[37]   3.000000000   6.000000000
  y_pred[38]   3.000000000   6.000000000
  y_pred[39]   3.000000000   6.000000000
  y_pred[40]   3.000000000   6.000000000
  y_pred[41]   3.000000000   6.000000000
  y_pred[42]   3.000000000   6.000000000
  y_pred[43]   3.000000000   6.000000000
  y_pred[44]   3.000000000   6.000000000
  y_pred[45]   3.000000000   6.000000000
  y_pred[46]   3.000000000   5.000000000
  y_pred[47]   3.000000000   5.000000000
  y_pred[48]   3.000000000   5.000000000
  y_pred[49]   3.000000000   5.000000000
  y_pred[50]   3.000000000   5.000000000
  y_pred[51]   3.000000000   5.000000000
  y_pred[52]   3.000000000   5.000000000
  y_pred[53]   3.000000000   5.000000000
  y_pred[54]   3.000000000   5.000000000
  y_pred[55]   3.000000000   5.000000000
  y_pred[56]   3.000000000   5.000000000
  y_pred[57]   3.000000000   5.000000000
  y_pred[58]   2.000000000   5.000000000
  y_pred[59]   2.000000000   5.000000000
  y_pred[60]   2.000000000   5.000000000
  y_pred[61]   2.000000000   4.000000000
  y_pred[62]   2.000000000   5.000000000
  y_pred[63]   2.000000000   4.000000000
  y_pred[64]   2.000000000   4.000000000
  y_pred[65]   2.000000000   4.000000000
  y_pred[66]   2.000000000   4.000000000
  y_pred[67]   2.000000000   4.000000000
  y_pred[68]   2.000000000   4.000000000
  y_pred[69]   2.000000000   4.000000000
  y_pred[70]   2.000000000   4.000000000
  y_pred[71]   2.000000000   4.000000000
  y_pred[72]   2.000000000   4.000000000
  y_pred[73]   2.000000000   4.000000000
  y_pred[74]   2.000000000   4.000000000
  y_pred[75]   2.000000000   4.000000000
  y_pred[76]   2.000000000   4.000000000
  y_pred[77]   2.000000000   4.000000000
  y_pred[78]   2.000000000   4.000000000
  y_pred[79]   2.000000000   4.000000000
  lp__       -11.621082136 -11.353400918

, , chains = chain:3

            stats
parameter            mean          sd        2.5%          25%          50%
  alpha       22.61805360 4.998767786  12.6612421  18.94296808  22.61231261
  beta        -0.01147999 0.002645437  -0.0166679  -0.01329615  -0.01148662
  y_pred[1]    3.94780000 2.046732269   1.0000000   2.00000000   4.00000000
  y_pred[2]    3.91240000 2.062129170   1.0000000   2.00000000   4.00000000
  y_pred[3]    3.87160000 2.043415902   0.0000000   2.00000000   4.00000000
  y_pred[4]    3.85360000 1.988607038   1.0000000   2.00000000   4.00000000
  y_pred[5]    3.72520000 1.984659350   1.0000000   2.00000000   4.00000000
  y_pred[6]    3.70480000 1.942887522   1.0000000   2.00000000   4.00000000
  y_pred[7]    3.67840000 1.958198240   0.0000000   2.00000000   3.00000000
  y_pred[8]    3.57440000 1.903152395   0.0000000   2.00000000   3.00000000
  y_pred[9]    3.55360000 1.892575866   0.0000000   2.00000000   3.00000000
  y_pred[10]   3.47120000 1.879115956   0.0000000   2.00000000   3.00000000
  y_pred[11]   3.41400000 1.881943767   0.0000000   2.00000000   3.00000000
  y_pred[12]   3.34660000 1.833995951   0.0000000   2.00000000   3.00000000
  y_pred[13]   3.29680000 1.834716053   0.0000000   2.00000000   3.00000000
  y_pred[14]   3.31160000 1.841516678   0.0000000   2.00000000   3.00000000
  y_pred[15]   3.27240000 1.846965212   0.0000000   2.00000000   3.00000000
  y_pred[16]   3.14900000 1.807166891   0.0000000   2.00000000   3.00000000
  y_pred[17]   3.14580000 1.816480740   0.0000000   2.00000000   3.00000000
  y_pred[18]   3.08880000 1.772326942   0.0000000   2.00000000   3.00000000
  y_pred[19]   3.12300000 1.800255311   0.0000000   2.00000000   3.00000000
  y_pred[20]   3.04380000 1.758778047   0.0000000   2.00000000   3.00000000
  y_pred[21]   3.05120000 1.758976225   0.0000000   2.00000000   3.00000000
  y_pred[22]   3.01420000 1.756193384   0.0000000   2.00000000   3.00000000
  y_pred[23]   2.93120000 1.745931332   0.0000000   2.00000000   3.00000000
  y_pred[24]   2.95820000 1.726455586   0.0000000   2.00000000   3.00000000
  y_pred[25]   2.87000000 1.693656900   0.0000000   2.00000000   3.00000000
  y_pred[26]   2.85480000 1.709239966   0.0000000   2.00000000   3.00000000
  y_pred[27]   2.82160000 1.708905355   0.0000000   2.00000000   3.00000000
  y_pred[28]   2.79360000 1.694453680   0.0000000   2.00000000   3.00000000
  y_pred[29]   2.78880000 1.701520965   0.0000000   2.00000000   3.00000000
  y_pred[30]   2.73560000 1.676858674   0.0000000   2.00000000   3.00000000
  y_pred[31]   2.71280000 1.653500219   0.0000000   2.00000000   3.00000000
  y_pred[32]   2.67920000 1.631937500   0.0000000   1.00000000   3.00000000
  y_pred[33]   2.61000000 1.650468058   0.0000000   1.00000000   2.00000000
  y_pred[34]   2.63100000 1.609769322   0.0000000   1.00000000   2.00000000
  y_pred[35]   2.57880000 1.605835079   0.0000000   1.00000000   2.00000000
  y_pred[36]   2.55100000 1.606958451   0.0000000   1.00000000   2.00000000
  y_pred[37]   2.49640000 1.603278248   0.0000000   1.00000000   2.00000000
  y_pred[38]   2.48960000 1.592230787   0.0000000   1.00000000   2.00000000
  y_pred[39]   2.47140000 1.618179762   0.0000000   1.00000000   2.00000000
  y_pred[40]   2.38700000 1.560678745   0.0000000   1.00000000   2.00000000
  y_pred[41]   2.43900000 1.585301749   0.0000000   1.00000000   2.00000000
  y_pred[42]   2.40360000 1.569968153   0.0000000   1.00000000   2.00000000
  y_pred[43]   2.33820000 1.554961202   0.0000000   1.00000000   2.00000000
  y_pred[44]   2.27500000 1.503272086   0.0000000   1.00000000   2.00000000
  y_pred[45]   2.21380000 1.486651131   0.0000000   1.00000000   2.00000000
  y_pred[46]   2.21440000 1.466445612   0.0000000   1.00000000   2.00000000
  y_pred[47]   2.12460000 1.456536694   0.0000000   1.00000000   2.00000000
  y_pred[48]   2.08380000 1.445266522   0.0000000   1.00000000   2.00000000
  y_pred[49]   2.01440000 1.447415504   0.0000000   1.00000000   2.00000000
  y_pred[50]   2.07700000 1.454198727   0.0000000   1.00000000   2.00000000
  y_pred[51]   2.05380000 1.443648982   0.0000000   1.00000000   2.00000000
  y_pred[52]   1.95000000 1.418556471   0.0000000   1.00000000   2.00000000
  y_pred[53]   1.93100000 1.398223875   0.0000000   1.00000000   2.00000000
  y_pred[54]   1.90960000 1.389321375   0.0000000   1.00000000   2.00000000
  y_pred[55]   1.87620000 1.402022358   0.0000000   1.00000000   2.00000000
  y_pred[56]   1.78980000 1.349955716   0.0000000   1.00000000   2.00000000
  y_pred[57]   1.71900000 1.332514210   0.0000000   1.00000000   2.00000000
  y_pred[58]   1.72580000 1.307270497   0.0000000   1.00000000   2.00000000
  y_pred[59]   1.70140000 1.327173809   0.0000000   1.00000000   2.00000000
  y_pred[60]   1.65240000 1.309319328   0.0000000   1.00000000   1.00000000
  y_pred[61]   1.60140000 1.290446082   0.0000000   1.00000000   1.00000000
  y_pred[62]   1.57200000 1.273868340   0.0000000   1.00000000   1.00000000
  y_pred[63]   1.56900000 1.293048102   0.0000000   1.00000000   1.00000000
  y_pred[64]   1.57340000 1.269068378   0.0000000   1.00000000   1.00000000
  y_pred[65]   1.53460000 1.228456210   0.0000000   1.00000000   1.00000000
  y_pred[66]   1.49780000 1.248481839   0.0000000   1.00000000   1.00000000
  y_pred[67]   1.49500000 1.237206990   0.0000000   1.00000000   1.00000000
  y_pred[68]   1.47460000 1.224113774   0.0000000   1.00000000   1.00000000
  y_pred[69]   1.47960000 1.224370719   0.0000000   1.00000000   1.00000000
  y_pred[70]   1.45060000 1.225014193   0.0000000   1.00000000   1.00000000
  y_pred[71]   1.40640000 1.191773093   0.0000000   1.00000000   1.00000000
  y_pred[72]   1.38340000 1.203783311   0.0000000   0.00000000   1.00000000
  y_pred[73]   1.39120000 1.186273159   0.0000000   0.00000000   1.00000000
  y_pred[74]   1.32960000 1.176282521   0.0000000   0.00000000   1.00000000
  y_pred[75]   1.31800000 1.174202602   0.0000000   0.00000000   1.00000000
  y_pred[76]   1.26640000 1.148779778   0.0000000   0.00000000   1.00000000
  y_pred[77]   1.18300000 1.100069557   0.0000000   0.00000000   1.00000000
  y_pred[78]   1.16080000 1.091779170   0.0000000   0.00000000   1.00000000
  y_pred[79]   1.13520000 1.095427292   0.0000000   0.00000000   1.00000000
  lp__       -12.36443078 1.033548407 -15.1766098 -12.77490309 -12.04163662
            stats
parameter              75%         97.5%
  alpha       26.056603972  32.441814713
  beta        -0.009531957  -0.006216254
  y_pred[1]    5.000000000   9.000000000
  y_pred[2]    5.000000000   8.000000000
  y_pred[3]    5.000000000   8.000000000
  y_pred[4]    5.000000000   8.000000000
  y_pred[5]    5.000000000   8.000000000
  y_pred[6]    5.000000000   8.000000000
  y_pred[7]    5.000000000   8.000000000
  y_pred[8]    5.000000000   8.000000000
  y_pred[9]    5.000000000   8.000000000
  y_pred[10]   5.000000000   8.000000000
  y_pred[11]   5.000000000   8.000000000
  y_pred[12]   4.000000000   7.000000000
  y_pred[13]   4.000000000   7.000000000
  y_pred[14]   4.000000000   7.000000000
  y_pred[15]   4.000000000   7.000000000
  y_pred[16]   4.000000000   7.000000000
  y_pred[17]   4.000000000   7.000000000
  y_pred[18]   4.000000000   7.000000000
  y_pred[19]   4.000000000   7.000000000
  y_pred[20]   4.000000000   7.000000000
  y_pred[21]   4.000000000   7.000000000
  y_pred[22]   4.000000000   7.000000000
  y_pred[23]   4.000000000   7.000000000
  y_pred[24]   4.000000000   7.000000000
  y_pred[25]   4.000000000   7.000000000
  y_pred[26]   4.000000000   7.000000000
  y_pred[27]   4.000000000   7.000000000
  y_pred[28]   4.000000000   7.000000000
  y_pred[29]   4.000000000   6.025000000
  y_pred[30]   4.000000000   6.000000000
  y_pred[31]   4.000000000   6.000000000
  y_pred[32]   4.000000000   6.000000000
  y_pred[33]   4.000000000   6.000000000
  y_pred[34]   4.000000000   6.000000000
  y_pred[35]   4.000000000   6.000000000
  y_pred[36]   4.000000000   6.000000000
  y_pred[37]   3.000000000   6.000000000
  y_pred[38]   3.000000000   6.000000000
  y_pred[39]   3.000000000   6.000000000
  y_pred[40]   3.000000000   6.000000000
  y_pred[41]   3.000000000   6.000000000
  y_pred[42]   3.000000000   6.000000000
  y_pred[43]   3.000000000   6.000000000
  y_pred[44]   3.000000000   6.000000000
  y_pred[45]   3.000000000   6.000000000
  y_pred[46]   3.000000000   5.000000000
  y_pred[47]   3.000000000   5.000000000
  y_pred[48]   3.000000000   5.000000000
  y_pred[49]   3.000000000   5.000000000
  y_pred[50]   3.000000000   5.000000000
  y_pred[51]   3.000000000   5.000000000
  y_pred[52]   3.000000000   5.000000000
  y_pred[53]   3.000000000   5.000000000
  y_pred[54]   3.000000000   5.000000000
  y_pred[55]   3.000000000   5.000000000
  y_pred[56]   3.000000000   5.000000000
  y_pred[57]   2.000000000   5.000000000
  y_pred[58]   2.000000000   5.000000000
  y_pred[59]   2.000000000   5.000000000
  y_pred[60]   2.000000000   5.000000000
  y_pred[61]   2.000000000   5.000000000
  y_pred[62]   2.000000000   4.000000000
  y_pred[63]   2.000000000   5.000000000
  y_pred[64]   2.000000000   4.000000000
  y_pred[65]   2.000000000   4.000000000
  y_pred[66]   2.000000000   4.000000000
  y_pred[67]   2.000000000   4.000000000
  y_pred[68]   2.000000000   4.000000000
  y_pred[69]   2.000000000   4.000000000
  y_pred[70]   2.000000000   4.000000000
  y_pred[71]   2.000000000   4.000000000
  y_pred[72]   2.000000000   4.000000000
  y_pred[73]   2.000000000   4.000000000
  y_pred[74]   2.000000000   4.000000000
  y_pred[75]   2.000000000   4.000000000
  y_pred[76]   2.000000000   4.000000000
  y_pred[77]   2.000000000   4.000000000
  y_pred[78]   2.000000000   4.000000000
  y_pred[79]   2.000000000   4.000000000
  lp__       -11.621549839 -11.350281393

, , chains = chain:4

            stats
parameter            mean          sd         2.5%          25%          50%
  alpha       22.80417187 4.682366625  13.89372617  19.62490452  22.77945018
  beta        -0.01158021 0.002479939  -0.01639852  -0.01323601  -0.01155781
  y_pred[1]    3.98780000 2.016944415   1.00000000   3.00000000   4.00000000
  y_pred[2]    3.95540000 2.004548449   1.00000000   3.00000000   4.00000000
  y_pred[3]    3.87740000 2.026521798   1.00000000   2.00000000   4.00000000
  y_pred[4]    3.82120000 2.037267938   0.00000000   2.00000000   4.00000000
  y_pred[5]    3.71160000 1.998505503   0.00000000   2.00000000   4.00000000
  y_pred[6]    3.70360000 1.980639196   0.00000000   2.00000000   3.00000000
  y_pred[7]    3.64720000 1.948920680   0.00000000   2.00000000   3.00000000
  y_pred[8]    3.60500000 1.943226756   0.00000000   2.00000000   3.00000000
  y_pred[9]    3.54620000 1.928473324   0.00000000   2.00000000   3.00000000
  y_pred[10]   3.51100000 1.899315793   0.00000000   2.00000000   3.00000000
  y_pred[11]   3.48060000 1.848866491   0.00000000   2.00000000   3.00000000
  y_pred[12]   3.36040000 1.844991230   0.00000000   2.00000000   3.00000000
  y_pred[13]   3.30600000 1.843106999   0.00000000   2.00000000   3.00000000
  y_pred[14]   3.33140000 1.852852032   0.00000000   2.00000000   3.00000000
  y_pred[15]   3.21700000 1.799933041   0.00000000   2.00000000   3.00000000
  y_pred[16]   3.17760000 1.815796702   0.00000000   2.00000000   3.00000000
  y_pred[17]   3.19300000 1.783476144   0.00000000   2.00000000   3.00000000
  y_pred[18]   3.10300000 1.795838525   0.00000000   2.00000000   3.00000000
  y_pred[19]   3.15960000 1.770297893   0.00000000   2.00000000   3.00000000
  y_pred[20]   2.98700000 1.744545697   0.00000000   2.00000000   3.00000000
  y_pred[21]   2.99740000 1.776407714   0.00000000   2.00000000   3.00000000
  y_pred[22]   3.04560000 1.750580917   0.00000000   2.00000000   3.00000000
  y_pred[23]   2.99820000 1.762276336   0.00000000   2.00000000   3.00000000
  y_pred[24]   2.92560000 1.720307104   0.00000000   2.00000000   3.00000000
  y_pred[25]   2.88600000 1.725919975   0.00000000   2.00000000   3.00000000
  y_pred[26]   2.81820000 1.725961920   0.00000000   2.00000000   3.00000000
  y_pred[27]   2.76140000 1.676494010   0.00000000   2.00000000   3.00000000
  y_pred[28]   2.77620000 1.679606435   0.00000000   2.00000000   3.00000000
  y_pred[29]   2.74240000 1.656680765   0.00000000   2.00000000   3.00000000
  y_pred[30]   2.73780000 1.668414781   0.00000000   2.00000000   3.00000000
  y_pred[31]   2.69600000 1.629145427   0.00000000   2.00000000   3.00000000
  y_pred[32]   2.66520000 1.612336468   0.00000000   2.00000000   2.00000000
  y_pred[33]   2.63060000 1.648116166   0.00000000   1.00000000   2.00000000
  y_pred[34]   2.62920000 1.639464830   0.00000000   1.00000000   2.00000000
  y_pred[35]   2.53320000 1.589780878   0.00000000   1.00000000   2.00000000
  y_pred[36]   2.52440000 1.598848305   0.00000000   1.00000000   2.00000000
  y_pred[37]   2.50620000 1.608377734   0.00000000   1.00000000   2.00000000
  y_pred[38]   2.45180000 1.587507731   0.00000000   1.00000000   2.00000000
  y_pred[39]   2.42460000 1.565504711   0.00000000   1.00000000   2.00000000
  y_pred[40]   2.43560000 1.565868139   0.00000000   1.00000000   2.00000000
  y_pred[41]   2.36740000 1.552707126   0.00000000   1.00000000   2.00000000
  y_pred[42]   2.37960000 1.543366527   0.00000000   1.00000000   2.00000000
  y_pred[43]   2.33780000 1.526485241   0.00000000   1.00000000   2.00000000
  y_pred[44]   2.29120000 1.514219645   0.00000000   1.00000000   2.00000000
  y_pred[45]   2.19780000 1.514177568   0.00000000   1.00000000   2.00000000
  y_pred[46]   2.17620000 1.487950389   0.00000000   1.00000000   2.00000000
  y_pred[47]   2.13900000 1.450137780   0.00000000   1.00000000   2.00000000
  y_pred[48]   2.08780000 1.440870008   0.00000000   1.00000000   2.00000000
  y_pred[49]   2.02340000 1.430685784   0.00000000   1.00000000   2.00000000
  y_pred[50]   2.03200000 1.460343289   0.00000000   1.00000000   2.00000000
  y_pred[51]   2.00000000 1.422253339   0.00000000   1.00000000   2.00000000
  y_pred[52]   1.89920000 1.378049043   0.00000000   1.00000000   2.00000000
  y_pred[53]   1.94420000 1.417423076   0.00000000   1.00000000   2.00000000
  y_pred[54]   1.89140000 1.395276101   0.00000000   1.00000000   2.00000000
  y_pred[55]   1.85860000 1.365642353   0.00000000   1.00000000   2.00000000
  y_pred[56]   1.79680000 1.342635578   0.00000000   1.00000000   2.00000000
  y_pred[57]   1.71940000 1.340232399   0.00000000   1.00000000   2.00000000
  y_pred[58]   1.67940000 1.320819652   0.00000000   1.00000000   1.00000000
  y_pred[59]   1.63400000 1.295677335   0.00000000   1.00000000   1.00000000
  y_pred[60]   1.59980000 1.275603933   0.00000000   1.00000000   1.00000000
  y_pred[61]   1.59720000 1.277841436   0.00000000   1.00000000   1.00000000
  y_pred[62]   1.59260000 1.265442811   0.00000000   1.00000000   1.00000000
  y_pred[63]   1.55560000 1.257785771   0.00000000   1.00000000   1.00000000
  y_pred[64]   1.54860000 1.266317003   0.00000000   1.00000000   1.00000000
  y_pred[65]   1.53800000 1.250307424   0.00000000   1.00000000   1.00000000
  y_pred[66]   1.52140000 1.253417828   0.00000000   1.00000000   1.00000000
  y_pred[67]   1.48600000 1.238753770   0.00000000   1.00000000   1.00000000
  y_pred[68]   1.46080000 1.251389849   0.00000000   1.00000000   1.00000000
  y_pred[69]   1.46020000 1.238677853   0.00000000   1.00000000   1.00000000
  y_pred[70]   1.45500000 1.215677002   0.00000000   1.00000000   1.00000000
  y_pred[71]   1.42700000 1.220233090   0.00000000   1.00000000   1.00000000
  y_pred[72]   1.42720000 1.207887395   0.00000000   1.00000000   1.00000000
  y_pred[73]   1.37460000 1.186489103   0.00000000   0.00000000   1.00000000
  y_pred[74]   1.35780000 1.170492704   0.00000000   0.00000000   1.00000000
  y_pred[75]   1.31500000 1.179005094   0.00000000   0.00000000   1.00000000
  y_pred[76]   1.27760000 1.153431541   0.00000000   0.00000000   1.00000000
  y_pred[77]   1.17320000 1.130069543   0.00000000   0.00000000   1.00000000
  y_pred[78]   1.13880000 1.074507084   0.00000000   0.00000000   1.00000000
  y_pred[79]   1.12020000 1.096354121   0.00000000   0.00000000   1.00000000
  lp__       -12.31662538 1.029051463 -15.16789075 -12.66861046 -12.01723170
            stats
parameter              75%         97.5%
  alpha       25.930141013  31.920646899
  beta        -0.009908512  -0.006861711
  y_pred[1]    5.000000000   8.000000000
  y_pred[2]    5.000000000   8.000000000
  y_pred[3]    5.000000000   8.000000000
  y_pred[4]    5.000000000   8.000000000
  y_pred[5]    5.000000000   8.000000000
  y_pred[6]    5.000000000   8.000000000
  y_pred[7]    5.000000000   8.000000000
  y_pred[8]    5.000000000   8.000000000
  y_pred[9]    5.000000000   8.000000000
  y_pred[10]   5.000000000   8.000000000
  y_pred[11]   5.000000000   8.000000000
  y_pred[12]   5.000000000   7.000000000
  y_pred[13]   4.000000000   7.000000000
  y_pred[14]   4.000000000   7.000000000
  y_pred[15]   4.000000000   7.000000000
  y_pred[16]   4.000000000   7.000000000
  y_pred[17]   4.000000000   7.000000000
  y_pred[18]   4.000000000   7.000000000
  y_pred[19]   4.000000000   7.000000000
  y_pred[20]   4.000000000   7.000000000
  y_pred[21]   4.000000000   7.000000000
  y_pred[22]   4.000000000   7.000000000
  y_pred[23]   4.000000000   7.000000000
  y_pred[24]   4.000000000   7.000000000
  y_pred[25]   4.000000000   7.000000000
  y_pred[26]   4.000000000   7.000000000
  y_pred[27]   4.000000000   6.000000000
  y_pred[28]   4.000000000   6.000000000
  y_pred[29]   4.000000000   6.000000000
  y_pred[30]   4.000000000   6.000000000
  y_pred[31]   4.000000000   6.000000000
  y_pred[32]   4.000000000   6.000000000
  y_pred[33]   4.000000000   6.000000000
  y_pred[34]   4.000000000   6.000000000
  y_pred[35]   3.000000000   6.000000000
  y_pred[36]   3.000000000   6.000000000
  y_pred[37]   3.000000000   6.000000000
  y_pred[38]   3.000000000   6.000000000
  y_pred[39]   3.000000000   6.000000000
  y_pred[40]   3.000000000   6.000000000
  y_pred[41]   3.000000000   6.000000000
  y_pred[42]   3.000000000   6.000000000
  y_pred[43]   3.000000000   6.000000000
  y_pred[44]   3.000000000   6.000000000
  y_pred[45]   3.000000000   6.000000000
  y_pred[46]   3.000000000   5.000000000
  y_pred[47]   3.000000000   5.000000000
  y_pred[48]   3.000000000   5.000000000
  y_pred[49]   3.000000000   5.000000000
  y_pred[50]   3.000000000   5.000000000
  y_pred[51]   3.000000000   5.000000000
  y_pred[52]   3.000000000   5.000000000
  y_pred[53]   3.000000000   5.000000000
  y_pred[54]   3.000000000   5.000000000
  y_pred[55]   3.000000000   5.000000000
  y_pred[56]   3.000000000   5.000000000
  y_pred[57]   2.000000000   5.000000000
  y_pred[58]   2.000000000   5.000000000
  y_pred[59]   2.000000000   5.000000000
  y_pred[60]   2.000000000   5.000000000
  y_pred[61]   2.000000000   5.000000000
  y_pred[62]   2.000000000   4.025000000
  y_pred[63]   2.000000000   4.000000000
  y_pred[64]   2.000000000   4.000000000
  y_pred[65]   2.000000000   4.000000000
  y_pred[66]   2.000000000   4.000000000
  y_pred[67]   2.000000000   4.000000000
  y_pred[68]   2.000000000   4.000000000
  y_pred[69]   2.000000000   4.000000000
  y_pred[70]   2.000000000   4.000000000
  y_pred[71]   2.000000000   4.000000000
  y_pred[72]   2.000000000   4.000000000
  y_pred[73]   2.000000000   4.000000000
  y_pred[74]   2.000000000   4.000000000
  y_pred[75]   2.000000000   4.000000000
  y_pred[76]   2.000000000   4.000000000
  y_pred[77]   2.000000000   4.000000000
  y_pred[78]   2.000000000   4.000000000
  y_pred[79]   2.000000000   4.000000000
  lp__       -11.603849353 -11.345792048
# Extract the parameter estimates
estimaciones_desastres <- extract(fit_muertes)
alpha_est <- estimaciones_desastres$alpha
beta_est <- estimaciones_desastres$beta

# Extract the model's predictions for the observed years
y_pred_desastres <- estimaciones_desastres$y_pred

# Compute the mean and standard deviation of the predictions
mean_pred_desastres <- colMeans(y_pred_desastres)
sd_pred_desastres <- apply(y_pred_desastres, 2, sd)

# Report the predictive summaries for disasters per year
cat("The predictive mean of disasters per year is:", mean(mean_pred_desastres), "\n")
The predictive mean of disasters per year is: 2.418924 
cat("The predictive standard deviation of disasters per year is:", mean(sd_pred_desastres), "\n")
The predictive standard deviation of disasters per year is: 1.55199 
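Per-year predictive intervals follow from the same draws. A minimal sketch, assuming `y_pred_desastres` is the draws-by-years matrix extracted above:

```r
# 95% posterior predictive interval for each observed year
# (one column of y_pred_desastres per year, one row per draw)
pred_ci <- apply(y_pred_desastres, 2, quantile, probs = c(0.025, 0.975))
t(pred_ci)[1:5, ]  # intervals for the first five years
```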
Plots and interpretation
# The mcmc_* plotting functions come from bayesplot
library(bayesplot)

# Trace plot of the MCMC chains for alpha and beta
mcmc_trace(as.array(fit_muertes), pars = c("alpha", "beta"))

# Posterior densities of alpha and beta, overlaid by chain
mcmc_dens_overlay(as.array(fit_muertes), pars = c("alpha", "beta"))

# Relationship between year and disaster counts (Poisson GLM smoother)
ggplot(data = data.frame(x = years, y = counts), aes(x = x, y = y)) +
  geom_point() +
  stat_smooth(method = "glm", method.args = list(family = "poisson"), se = FALSE)
`geom_smooth()` using formula = 'y ~ x'

# Credible intervals for alpha and beta
mcmc_intervals(as.array(fit_muertes), pars = c("alpha", "beta"))

Based on the statistical model, we estimate that the mine experiences on average 2.42 disasters per year, with a typical variability of about one predictive standard deviation, i.e. roughly 0.86 to 3.98 disasters in a given year. These figures help the insurance company assess risk and plan its policies and mitigation strategies accordingly.
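The 0.86–3.98 range quoted above is just the predictive mean plus or minus one predictive standard deviation:

```r
# Quick check: predictive mean ± 1 predictive sd
2.4189 + c(-1, 1) * 1.5520  # ≈ 0.87 and 3.97, close to the quoted range
```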

  • alpha:
    • Point estimate: 22.92
    • 95% credible interval: [13.23, 32.53]
    • Interpretation: alpha is the intercept of the log rate, i.e. the extrapolated log disaster rate at year 0 — not a disaster count itself; the rate in a given year is exp(alpha + beta * year). The 95% credible interval of [13.23, 32.53] is quite wide, reflecting substantial uncertainty about the intercept, largely because year 0 lies far outside the observed range.
  • beta:
    • Point estimate: -0.01
    • 95% credible interval: [-0.02, -0.01]
    • Interpretation: beta is the change in the log rate per additional year. Since exp(-0.01) ≈ 0.99, each additional year multiplies the expected disaster rate by about 0.99, a decrease of roughly 1% per year. Given the 95% credible interval, we can be about 95% confident that the true effect lies between -0.02 and -0.01, i.e. the disaster rate is declining over time.
  • La media predictiva de desastres por año es: 2.416199
  • La desviación estándar predictiva de desastres por año es: 1.550927
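Because the model uses a log link, beta reads most naturally as a rate ratio. A small back-of-the-envelope check using the approximate posterior mean reported in the summaries above:

```r
# Multiplicative effect on the disaster rate of one additional year
beta_hat <- -0.0115   # approximate posterior mean of beta
exp(beta_hat)         # ≈ 0.989: about a 1.1% decrease per year
exp(10 * beta_hat)    # ≈ 0.891: roughly an 11% decrease per decade
```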
# PPC (posterior predictive check)

# Extract the posterior samples of the model parameters
samples <- extract(fit_muertes)

# inv_logit helper (not actually used below: the Poisson model
# works on the log scale, so predictions use exp(mu))
inv_logit <- function(x) {
  exp(x) / (1 + exp(x))
}

# Generate posterior predictive draws for each observation
n_obs <- length(years)               # number of observations
n_samples <- dim(samples$alpha)[1]   # number of MCMC draws
yrep <- matrix(NA, nrow = n_obs, ncol = n_samples)  # matrix of predictions

for (i in 1:n_samples) {
  mu <- samples$alpha[i] + samples$beta[i] * years
  yrep[, i] <- rpois(n_obs, exp(mu))  # Poisson draws with rate exp(mu)
}

# Average the predictive draws for each observation
predicciones_media <- apply(yrep, 1, mean)

# Common axis limits for the scatter plot
x_limits <- range(c(counts, predicciones_media))
y_limits <- range(c(counts, predicciones_media))

# Scatter plot of observed vs. mean predicted values
plot(counts, predicciones_media,
     xlab = "Observed values",
     ylab = "Mean of predicted values",
     main = "PPC: observed vs. mean predicted values",
     xlim = x_limits,
     ylim = y_limits)

# Add the identity line y = x in red
abline(a = 0, b = 1, col = "red")

We observe that the model does not fit the data well. This suggests it does not fully capture the underlying structure of mine disasters over time. Other factors not included in the model may influence the occurrence of disasters, or the relationship between time and disaster frequency may not be log-linear as the model assumes.
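A complementary check is to compare the full observed distribution of counts against the predictive draws. A sketch using bayesplot, assuming `yrep` is the n_obs × n_samples matrix built above:

```r
library(bayesplot)

# Density overlay: observed counts vs. 100 posterior predictive draws
# (bayesplot expects one draw per row, hence the transpose)
ppc_dens_overlay(y = counts, yrep = t(yrep)[1:100, ])
```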

Part ii)
# The parameter tau is the change point: the year at which the disaster
# rate changes. Before tau the rate is given by alpha (b0); from tau
# onward it is given by beta (b1).
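The piecewise rate this comment describes can be sketched directly. The values of b0, b1 and tau below are illustrative placeholders, not estimates (those come from the Stan fit further down):

```r
# Piecewise-constant Poisson rate with a change point at tau
rate <- function(year, b0, b1, tau) ifelse(year < tau, b0, b1)

# Illustrative values only: rate drops from 3 to 1 at 1890
curve_years <- 1851:1962
plot(curve_years, rate(curve_years, b0 = 3, b1 = 1, tau = 1890),
     type = "s", xlab = "Year", ylab = "Expected disasters per year")
```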

library(rstan)
library(boot)

# Load the data
data(coal)

# Convert the dates to whole years since 1851 and count disasters per year
disaster_years <- as.integer(floor(coal$date))
disaster_counts <- table(disaster_years)

# Vector of every year from the first to the last recorded year
all_years <- seq(from = min(disaster_years), to = max(disaster_years))

# Initialize the counts at zero for all years
counts_full <- setNames(rep(0, length(all_years)), all_years)

# Fill in the counts for the years with recorded disasters
counts_full[names(disaster_counts)] <- as.integer(disaster_counts)

# Convert the counts to an unnamed numeric vector
counts <- as.integer(counts_full)

# Prepare the data for Stan, including min_year and max_year
stan_data <- list(
  N = length(all_years),
  disasters = counts,
  years = as.numeric(names(counts_full)),
  min_year = as.numeric(min(all_years)),  # adding min_year
  max_year = as.numeric(max(all_years))   # adding max_year
)

# Adjust the path to the Stan file as needed
stan_model_path <- "Ej5-modelo3.stan"

# Compile and fit the Stan model
stan_model <- stan_model(file = stan_model_path)
fit <- sampling(stan_model, data = stan_data, iter = 10000, warmup = 5000, chains = 4)

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 2.4e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.24 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 1: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 1: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 1: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 1: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 1: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 1: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 1: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 1: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 1: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 1: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 1: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 45.103 seconds (Warm-up)
Chain 1:                45.518 seconds (Sampling)
Chain 1:                90.621 seconds (Total)
Chain 1: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 2).
Chain 2: 
Chain 2: Gradient evaluation took 1.1e-05 seconds
Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 0.11 seconds.
Chain 2: Adjust your expectations accordingly!
Chain 2: 
Chain 2: 
Chain 2: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 2: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 2: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 2: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 2: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 2: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 2: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 2: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 2: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 2: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 2: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 2: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 2: 
Chain 2:  Elapsed Time: 44.792 seconds (Warm-up)
Chain 2:                45.556 seconds (Sampling)
Chain 2:                90.348 seconds (Total)
Chain 2: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 3).
Chain 3: 
Chain 3: Gradient evaluation took 9e-06 seconds
Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 0.09 seconds.
Chain 3: Adjust your expectations accordingly!
Chain 3: 
Chain 3: 
Chain 3: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 3: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 3: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 3: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 3: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 3: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 3: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 3: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 3: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 3: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 3: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 3: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 3: 
Chain 3:  Elapsed Time: 45.349 seconds (Warm-up)
Chain 3:                45.943 seconds (Sampling)
Chain 3:                91.292 seconds (Total)
Chain 3: 

SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 4).
Chain 4: 
Chain 4: Gradient evaluation took 1.1e-05 seconds
Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 0.11 seconds.
Chain 4: Adjust your expectations accordingly!
Chain 4: 
Chain 4: 
Chain 4: Iteration:    1 / 10000 [  0%]  (Warmup)
Chain 4: Iteration: 1000 / 10000 [ 10%]  (Warmup)
Chain 4: Iteration: 2000 / 10000 [ 20%]  (Warmup)
Chain 4: Iteration: 3000 / 10000 [ 30%]  (Warmup)
Chain 4: Iteration: 4000 / 10000 [ 40%]  (Warmup)
Chain 4: Iteration: 5000 / 10000 [ 50%]  (Warmup)
Chain 4: Iteration: 5001 / 10000 [ 50%]  (Sampling)
Chain 4: Iteration: 6000 / 10000 [ 60%]  (Sampling)
Chain 4: Iteration: 7000 / 10000 [ 70%]  (Sampling)
Chain 4: Iteration: 8000 / 10000 [ 80%]  (Sampling)
Chain 4: Iteration: 9000 / 10000 [ 90%]  (Sampling)
Chain 4: Iteration: 10000 / 10000 [100%]  (Sampling)
Chain 4: 
Chain 4:  Elapsed Time: 45.183 seconds (Warm-up)
Chain 4:                46.234 seconds (Sampling)
Chain 4:                91.417 seconds (Total)
Chain 4: 
Warning: There were 19553 transitions after warmup that exceeded the maximum treedepth. Increase max_treedepth above 10. See
https://mc-stan.org/misc/warnings.html#maximum-treedepth-exceeded
Warning: Examine the pairs() plot to diagnose sampling problems
# Print the results
print(fit)
Inference for Stan model: anon_model.
4 chains, each with iter=10000; warmup=5000; thin=1; 
post-warmup draws per chain=5000, total post-warmup draws=20000.

         mean se_mean   sd    2.5%     25%     50%     75%   97.5% n_eff Rhat
beta0    1.14    0.00 0.10    0.94    1.07    1.14    1.20    1.32   830    1
beta1   -1.23    0.01 0.16   -1.53   -1.33   -1.22   -1.12   -0.93   888    1
tau   1890.49    0.08 2.45 1886.15 1889.15 1890.69 1891.78 1896.44   985    1
lp__   -52.58    0.03 1.33  -56.02  -53.17  -52.26  -51.64  -50.92  1954    1

Samples were drawn using NUTS(diag_e) at Sun Mar 17 13:41:26 2024.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at 
convergence, Rhat=1).
summary(fit)
$summary
             mean     se_mean         sd         2.5%         25%         50%
beta0    1.137130 0.003339836 0.09620865    0.9449748    1.073868    1.138279
beta1   -1.227894 0.005203429 0.15504598   -1.5328288   -1.331788   -1.224919
tau   1890.490814 0.077979534 2.44762142 1886.1502010 1889.149921 1890.685107
lp__   -52.582309 0.030118546 1.33126846  -56.0218961  -53.170964  -52.256975
              75%        97.5%     n_eff     Rhat
beta0    1.202659    1.3229838  829.8088 1.004990
beta1   -1.120500   -0.9321528  887.8543 1.002887
tau   1891.779540 1896.4432010  985.2064 1.001399
lp__   -51.642299  -50.9202836 1953.7243 1.001276

$c_summary
, , chains = chain:1

         stats
parameter        mean         sd         2.5%         25%         50%
    beta0    1.125890 0.09541181    0.9275877    1.063588    1.128601
    beta1   -1.215192 0.15317930   -1.5206998   -1.315154   -1.211544
    tau   1890.588482 2.44874578 1886.1707533 1889.250091 1890.776933
    lp__   -52.584476 1.38167342  -56.3268560  -53.152246  -52.225632
         stats
parameter         75%        97.5%
    beta0    1.191649    1.3045067
    beta1   -1.112432   -0.9204617
    tau   1891.812235 1896.4810669
    lp__   -51.615335  -50.9059227

, , chains = chain:2

         stats
parameter        mean         sd        2.5%         25%         50%
    beta0    1.135319 0.09847063    0.927935    1.071463    1.137902
    beta1   -1.222211 0.15577064   -1.521014   -1.326764   -1.222809
    tau   1890.349316 2.38646266 1886.154663 1889.048578 1890.540380
    lp__   -52.594765 1.33067770  -56.029261  -53.176696  -52.255682
         stats
parameter         75%        97.5%
    beta0    1.201819    1.3239472
    beta1   -1.117065   -0.9163291
    tau   1891.723355 1896.2771001
    lp__   -51.646078  -50.9263593

, , chains = chain:3

         stats
parameter        mean         sd         2.5%         25%         50%
    beta0    1.149711 0.08955377    0.9746882    1.088991    1.149426
    beta1   -1.244208 0.14971751   -1.5362948   -1.349166   -1.244195
    tau   1890.498737 2.35974968 1886.1397598 1889.191488 1890.735420
    lp__   -52.517202 1.22098471  -55.5892678  -53.127334  -52.247847
         stats
parameter         75%        97.5%
    beta0    1.209788    1.3253952
    beta1   -1.137004   -0.9685627
    tau   1891.807803 1896.1171019
    lp__   -51.615856  -50.9070188

, , chains = chain:4

         stats
parameter        mean         sd         2.5%         25%         50%
    beta0    1.137602 0.09960948    0.9481158    1.070399    1.136723
    beta1   -1.229965 0.15988609   -1.5497242   -1.334760   -1.224349
    tau   1890.526722 2.58381508 1886.1295222 1889.115144 1890.660063
    lp__   -52.632795 1.38300009  -56.1668227  -53.244609  -52.302362
         stats
parameter         75%       97.5%
    beta0    1.204941    1.333438
    beta1   -1.118363   -0.930754
    tau   1891.771993 1896.673267
    lp__   -51.686628  -50.940586

beta0 (Intercept):

  • Mean: The posterior mean of beta0 is about 1.137, the expected log disaster rate in the baseline period (before the changepoint tau).

  • Standard deviation (sd): A standard deviation of 0.096 indicates moderate posterior uncertainty about beta0.

  • 95% CI: The 95% credible interval runs from about 0.945 to 1.323; under the model, beta0 lies in this interval with 95% posterior probability.

  • Convergence: An Rhat of 1.004990 is very close to 1, which generally indicates good convergence. An n_eff of about 830 is an acceptable effective sample size, though small relative to the 20,000 post-warmup draws, which points to noticeable autocorrelation in the chains.

beta1 (Shift after the changepoint tau):

  • Mean: The posterior mean of beta1 is about -1.228, indicating a drop in the log disaster rate after the changepoint tau.

  • Standard deviation (sd): A standard deviation of 0.155 indicates moderate posterior uncertainty about beta1.

  • 95% CI: The 95% credible interval runs from about -1.533 to -0.932. It excludes zero, so the model gives clear evidence that the disaster rate genuinely falls after tau.

  • Convergence: The Rhat of 1.002887 is very close to 1, indicating good convergence, and an n_eff of about 888 is adequate, so the estimates are reliable.

tau (Changepoint year):

  • Mean: The posterior mean of tau is about 1890.49, so the change in the disaster rate is centered around that year.

  • Standard deviation (sd): A standard deviation of 2.448 reflects some uncertainty about the exact year of the change, but it is not excessive.

  • 95% CI: The 95% credible interval runs from about 1886.15 to 1896.44, a window within which the change most likely occurred.

  • Convergence: An Rhat of 1.001399 and an n_eff of about 985 suggest the sampler converged well and the tau estimates are robust.

One caveat: the sampler reported 19,553 post-warmup transitions that exceeded the maximum treedepth. Rhat and n_eff look fine, but refitting with a larger treedepth (e.g. control = list(max_treedepth = 12) in sampling()) would make the exploration more efficient.
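To put beta0 and beta1 back on the count scale, the posterior means imply the following disaster rates; a quick arithmetic check using the point estimates reported above (shown here in Python, since it is only arithmetic):

```python
import math

# Posterior means from the summary above (point estimates, for illustration)
beta0 = 1.137   # log-rate before the changepoint
beta1 = -1.228  # shift in the log-rate after the changepoint

rate_before = math.exp(beta0)         # expected disasters per year before tau
rate_after = math.exp(beta0 + beta1)  # expected disasters per year after tau

print(round(rate_before, 2), round(rate_after, 2))  # → 3.12 0.91
```

So the model implies roughly 3.1 disasters per year before ~1890 and about 0.9 per year afterwards, a reduction of about 71% (1 - exp(beta1) ≈ 0.71).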

library(bayesplot)  # required for the mcmc_* plotting functions below

mcmc_trace(fit, pars = c("beta0", "beta1", "tau"))

mcmc_areas(fit, pars = c("beta0", "beta1", "tau"))

mcmc_pairs(fit, pars = c("beta0", "beta1", "tau"))

mcmc_acf(fit, pars = c("beta0", "beta1", "tau"))
Warning: The `facets` argument of `facet_grid()` is deprecated as of ggplot2 2.2.0.
ℹ Please use the `rows` argument instead.
ℹ The deprecated feature was likely used in the bayesplot package.
  Please report the issue at <https://github.com/stan-dev/bayesplot/issues/>.

mcmc_intervals(fit, pars = c("beta0", "beta1", "tau"))